Modularize the Qonto iOS app for 50% faster builds

Marine Commerçon · Published in The Qonto Way · Jun 14, 2024 · 10 min read

If you’re a mobile engineer or have ever talked to one, you probably know that build time can be a major frustration. In a monolithic codebase, every code addition makes the build slower. Sometimes you might wonder whether your machine is having issues, whether you’re becoming impatient, or whether it’s simply “one of those days.”

What if there was a way to transform this frustration into efficiency and harmony through a modular approach?

From monolithic chaos to harmonized modules: the metamorphosis of the Qonto iOS project. As the codebase evolves with modularization, the well-defined clusters indicate low coupling and a good separation of concerns. (Dependency graph visualization using Emerge)

A few years ago, the iOS team at Qonto was asking itself those very questions. To get an objective overview of the situation, we implemented a tracker on volunteer engineers’ machines to anonymously report build times. Any doubts were quickly settled: our build time was growing!

Each engineer waited almost 2 minutes on average to build the project. With each of us compiling the application approximately 10 times a day, this translated to a monthly total of more than 200 hours spent waiting. Beyond dampening the team’s mood daily, it was affecting our efficiency, and one question was on everyone’s mind:

How could we maintain our feature delivery pace with a growing build time?

Considering the growth of our team (in the last 2 years we’ve gone from 25 to 65 iOS engineers) and the codebase expansion forecast, we estimated that we had just a couple of months to reverse the trend.

Our goal was simple:

Halve the build time and keep it that low, whatever the size of our project.

It was time to start a Kaizen: an initiative where engineers meet regularly to crack a problem with multiple root causes, iterating step by step.

Kaizen spirit

Now that our problem was clear, we met in the Kaizen spirit to analyze the reasons behind this growing build time. Among the solutions proposed, we opted first for those directly benefiting the build time, such as switching to M1 Pro machines and fixing slow-compiling code. But we knew it wouldn’t stop the bleeding. After a brief respite brought by these changes, the compile time started to climb drastically again, as the graph below shows.

Average time needed for each compilation of the iOS project over the past months. Drastically reduced thanks to our short-term optimizations but, unfortunately, not sufficient: the build time continued to grow a few months later.

Our metrics helped us to follow the trend and predict the build time evolution. Compiling the project would take more than 5 minutes on average today if we had left the situation as it was.

Let’s do a quick calculation:

65 engineers * 10 compilations per day * (5 min estimated − 1 min target) * 21 working days ≈ 54,600 minutes, or 910 hours lost in just one month.

The forecasted gain for the team would be massive. We were ready to consider more intensive options to implement a sustainable solution.

Time to modularize

Metrics confirmed that the more code we introduced, the longer we had to wait for compilation. Removing code was not an option. Splitting the codebase into several applications wasn’t either, as our clients expect a unified app to manage everything. Our last option was modularization.

In a modularized codebase, developers work inside modules to quickly compile the code related to a specific component. Combined with the caching mechanism of modules, this also leads to faster builds for the main application. And it’s only one benefit among others.

You may have a fair question then: why didn’t we modularize earlier? Years ago, the team didn’t face issues with build time. Modularization could have brought light optimizations, but we had higher priorities to tackle first, such as improving our architecture and separation of concerns. We were slowly preparing the ground for such a move.

We’d reached the right time to embark on modularization.

We knew we needed to allocate enough resources to improve the build time quickly and seamlessly. We read several articles presenting the benefits of modularization and its technical side. But we didn’t find much explaining how to do it when your project has reached a million lines of code and production delivery never stops.

At that point, we still didn’t know how to do it or how long it would take.

We had 4 challenges to address:

  1. Define a strategic architecture.
  2. Estimate the refactoring.
  3. Support numerous engineers working in parallel.
  4. Don’t impact the workflow and delivery.

1. Define a strategic architecture

The ideal slicing for a modularized codebase is to strike the balance between granularity and cohesion. Modules should be small enough for manageability and reusability, yet cohesive enough to represent meaningful functionality.

Having this in mind, we also aimed for:

  • a quick impact on the build time,
  • empowering teams with their modules for faster workflow,
  • a seamless transition for our team.

Our first decision was to opt for slicing by feature, to encourage the ownership of modules in each team and keep it convenient for engineers.

If you’re curious about the tools we used to modularize the codebase, we’ll delve into the technical side in a follow-up article. We’ll explore how Tuist helps us maintain consistency, achieve scalability, and efficiently manage a large project with multiple targets, making it easier for our extensive team to collaborate.
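To give a rough idea of what slicing by feature looks like in practice, here is a minimal sketch of a per-feature module manifest in Tuist. Module names, bundle identifiers, and paths are hypothetical, and the exact manifest API depends on your Tuist version; this is an illustration, not our actual configuration.

```swift
import ProjectDescription

// Hypothetical Project.swift for a feature module owned by one team.
let project = Project(
    name: "Transfers",
    targets: [
        Target(
            name: "Transfers",
            platform: .iOS,
            product: .framework,
            bundleId: "com.example.transfers",
            sources: ["Sources/**"],
            dependencies: [
                // Shared code extracted from the monolith lives in its own modules.
                .project(target: "SharedUI", path: "../SharedUI")
            ]
        ),
        Target(
            name: "TransfersTests",
            platform: .iOS,
            product: .unitTests,
            bundleId: "com.example.transfers.tests",
            sources: ["Tests/**"],
            dependencies: [
                .target(name: "Transfers")
            ]
        )
    ]
)
```

Each team can then open and build its own module in isolation, while the main app project simply depends on the feature frameworks.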

Was it just a matter of extracting one module per feature?

Not really. Creating a module for a new feature is cheap, but extracting an existing one is a whole different story! You have to deal with tightly coupled code and improve it first. To achieve our goal of quickly impacting the build time, our strategy was to extract large modules first (those containing multiple features). We weren’t 100% confident in this choice, but we planned to slice them further if needed.

Since our application is divided into tabs, the plan was to create a module for each and then specific modules for shared components. In the schema below, you can see how we extracted the Transfers tab.

Representation of how we split the Transfers tab code into modules. Most of it moved to the Transfers module, which contains the different transfer-related features; one UI component reused across the app moved into a shared module.

With a better idea of the modules that we wanted to extract, we were finally ready to draft the roadmap. So, we said to ourselves:

“Ok, let’s start with the first module. How much time do we need?”

Despite our in-depth knowledge of the codebase, this was the first time we had done such a refactoring, so it was impossible to guess how complex and time-consuming extracting a module would be.

2. Estimate the refactoring

At Qonto, we never start an implementation without a deep understanding of where we are going. This enables us to keep control over the scope, estimate the effort required, and thus, better organize the tasks.

The goal here was to identify the simplest module to extract and estimate the effort needed. We decided to run a dependency visualization tool on our million lines of code. It produced the image below, where dots represent files and lines their dependencies.

Can you guess which one would be easy to extract? We couldn’t.

Dependency visualization of the iOS codebase before modularization, using Emerge. The overlapping clusters indicate a high degree of coupling and poor separation of concerns, suggesting that the codebase lacks modularity and well-defined interfaces.

As there was no obvious module to start with, we decided to focus on small features first to train ourselves. We asked our team to extract them in a quick and unconventional way, just to “play with the codebase.” Our goal was to reveal the complexity we’d have to face, to anticipate future needs.

Concretely

When moving a feature into a new module, Xcode will raise errors. Hundreds of errors.

Xcode displaying 1272 errors when building our Cards module, right after we moved the code inside.

The code moved into the new module tries to access classes that are still in the monolith. This can be fixed by extracting parts of the monolith into shared modules that both the new module and the rest of the monolith can use. The goal of the rapid exploration was to make the module compile without investing time in clean fixes.

Here are some tips from concrete actions we took:

  • import all dependencies: don’t lose time identifying which ones are really needed by the new module,
  • if not extracted yet, duplicate some of the monolith code inside the module,
  • comment out the code raising errors,
  • don’t extract all the tests: one is enough to reveal the complexity.
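To make this concrete, here is a simplified sketch (with hypothetical type and module names) of the most common fix: a helper that was internal to the monolith gets extracted into a shared module and exposed as public, so both the monolith and the new feature module can use it.

```swift
import Foundation

// SharedFormatting/Sources/AmountFormatter.swift (hypothetical shared module)
// Previously an internal type buried in the monolith; marking it `public`
// lets the monolith and the new feature module both depend on it.
public struct AmountFormatter {
    public init() {}
    public func string(from amount: Decimal) -> String {
        "\(amount) €"
    }
}

// Transfers/Sources/TransferListViewModel.swift (new feature module)
// In the real project this file would start with `import SharedFormatting`;
// during rapid exploration, the type could also be temporarily duplicated
// or the offending call commented out, as listed in the tips above.
struct TransferListViewModel {
    let formatter = AmountFormatter()

    func title(for amount: Decimal) -> String {
        "Transfer of \(formatter.string(from: amount))"
    }
}
```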

The aim was to break down each extraction into several tasks, each clearly defined with its own complexity. We could then work out the total time needed to extract each module.

These preliminary investigations provided unexpected insights. We understood that certain extractions, like the shared components, had to take priority, and that others couldn’t be done in parallel due to a large amount of coupled code.

We started to draft a roadmap, which we kept completing as we progressed through the explorations.

This quickly raised a new question:

How could we best utilize engineers’ time if most extractions cannot proceed simultaneously?

Draft of our first roadmap, reflecting the challenge of moving quickly with multiple engineers when extractions can’t be done in parallel.

3. Support numerous engineers working in parallel

To achieve our goal in a few months, we agreed to secure as many resources as possible for this refactoring. The great news was that we had up to 12 engineers working on it at the same time. But this also turned into one of the biggest challenges we faced.

While pairing is a common way to help engineers work together and speed up delivery, it was less applicable to modularization. All of our engineers confirmed they would move faster alone than in a pair, and ideally with no other extraction in progress, to reduce potential conflicts when touching the same files.

Do you remember the rapid exploration phase? Well, it paved the way for our work here. All the issues raised by the team were an endless source of improvements (scripts, mocks, automation…).

That’s how we came to the idea of dividing the team into four roles:

  1. raising issues via explorations,
  2. handling hard points raised by explorations,
  3. improving our tool,
  4. extracting features.

A better optimization of our task force, thanks to various tasks aimed at preparing extractions and improving our groundwork.

Indeed, modularization isn’t just about crafting modules. Depending on your codebase and existing tooling, you may encounter various challenges in ensuring seamless integration.

Launching full steam ahead into modularization, we also needed to take the rest of our team into account.

The first merges reminded us of our final challenge: not impacting them!

Message shared by an iOS engineer, alerting us to the impact of the modularization refactoring on his feature.

4. Don’t impact the workflow and delivery

One of our initial goals was to ensure that the product and tech teams working on new features would not be disrupted or affected by our changes. However, the process of moving thousands of lines of code while your team delivers features inevitably produces merge conflicts in Git. And not the simplest ones.

Where possible, we prioritized extracting features that were not being updated at the same time. Otherwise, we maintained constant communication between teams, let priority features merge first, and took responsibility for resolving any conflicts on our end.

While we were extracting modules, the team continued to ship new features and add more code to the monolith. That’s what we call technical debt: we don’t apply our best practices (new modules); instead, we continue to deliver features using the old (monolith) way, resulting in more refactoring later on.

When prudent and deliberate, technical debt can be the right way to go.

Let me explain why. Thanks to a survey sent out regularly, we could measure the complexity that modularization brought. While everyone was happy with the new build time, we now faced new challenges with dependencies and ambiguous errors. Issues related to caching and wrong configurations took the team a lot of work to understand and fix. We had to bring more stability and simplicity before asking them to create modules for new features.

In short, the cost of introducing technical debt was lower than the cost of having the team work with a still-complex solution.

Evolution of the impact of modularization on engineers during the refactoring. The initial impact was negative, and it took around 6 months to reverse the trend and receive positive feedback.

After several improvements, with the team happy again, it was the right time to launch training sessions and prepare for the last step: using modularization for new features.

Back in their teams, the 25 engineers who participated in this huge refactoring became mentors. They helped spread the knowledge and smooth the transition.

Where are we now?

It took us 6 months to drastically reduce build time.

Build time reduction over the past months, following the progress of the codebase extraction into modules: from 1.5 minutes down to 0.5 minutes.

Build time was the main reason we started this huge refactoring. However, it’s not the only improvement we’re aiming for.

There are many more implicit benefits that will drastically reduce our costs while improving engineers’ lives. Now that each of our teams owns its own modules, we can easily:

  • improve module interfaces to optimize compilation,
  • extract UI tests into modules,
  • detect which modules were updated and run only the associated unit and UI tests on CI (Continuous Integration), as sketched below,
  • build a demo app for each module to showcase it in isolation,
  • generate code, such as mocks, only for the updated modules.
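As an illustration of the selective testing idea mentioned above, here is a minimal sketch of a Swift script a CI job could run. Everything in it is hypothetical (the Modules/ folder layout, the “<Name>Tests” scheme naming); it simply maps changed file paths to the modules that contain them.

```swift
import Foundation

// Hypothetical CI helper: list files changed since main, derive the impacted
// modules from their paths, and print the test schemes worth running.
func changedFiles(since base: String = "origin/main") throws -> [String] {
    let process = Process()
    process.executableURL = URL(fileURLWithPath: "/usr/bin/env")
    process.arguments = ["git", "diff", "--name-only", base]
    let pipe = Pipe()
    process.standardOutput = pipe
    try process.run()
    process.waitUntilExit()
    let output = String(decoding: pipe.fileHandleForReading.readDataToEndOfFile(), as: UTF8.self)
    return output.split(separator: "\n").map(String.init)
}

do {
    // Assumes each module lives under "Modules/<Name>/..." and has a "<Name>Tests" scheme.
    let impactedModules = Set(try changedFiles().compactMap { path -> String? in
        let parts = path.split(separator: "/")
        guard parts.count > 1, parts[0] == "Modules" else { return nil }
        return String(parts[1])
    })
    print("Test schemes to run:", impactedModules.sorted().map { "\($0)Tests" }.joined(separator: ", "))
} catch {
    print("Failed to list changed files: \(error)")
}
```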

When converted into saved hours, these improvements could well surpass the gains from build time alone. I’ll leave you to guess the total impact of modularization in terms of money saved!

Above all, we have a proud and happy team, excited and empowered to tackle all coming challenges.

Want to join the adventure? Qonto is hiring!

About Qonto

Qonto makes it easy for SMEs and freelancers to manage day-to-day banking, thanks to an online business account that’s stacked with invoicing, bookkeeping and spend management tools.

Created in 2016 by Alexandre Prot and Steve Anavi, Qonto now operates in 4 European markets (France, Germany, Italy, and Spain) serving 500,000 customers, and employs more than 1,600 people.

Since its creation, Qonto has raised €622 million from well-established investors. Qonto is one of France’s most highly valued scale-ups and has been listed in the Next40 index, bringing together future global tech leaders, since 2021.

Interested in joining a challenging and game-changing company? Take a look at our open positions.

Illustration by Eloïse Rulquin.
