Published in Bumble Tech

How to use Composite builds as a replacement of buildSrc in Gradle

The Gradle buildSrc approach has become the standard way of implementing custom plugins and tasks and of specifying common configuration (such as dependency lists and versions), but it has one major flaw: any change to it invalidates the build cache. Gradle also provides an alternative, the composite build approach, which lacks this flaw. In this article, I describe how to use a composite build instead of buildSrc and the challenges to expect from migration.

My experience of Gradle configurations

The Gradle build system was introduced to Android development together with Android Studio, but Android Studio has not always been around. If you have been developing Android applications for more than six years, you probably remember Eclipse with the Android plugin. Today that plugin is deprecated and unsupported.

Gradle and Groovy were absolute magic at that time (actually, they still are) and I just copy-pasted everything from StackOverflow. Applications rarely had more than one module, so keeping everything in one file was absolutely fine.

I started using modules for external libraries whenever I wanted to fix or adapt something inside them, and I just copy-pasted all the magic from the application's build script to the library's one. It worked fine, except that everything needed to be duplicated, e.g. updating dependency versions in all modules. I later discovered that you can use the root build.gradle to define common configuration for all modules.

This felt much better but it still wasn't perfect. The root build.gradle had become too big and complicated to maintain. At the time, modularisation was starting to become popular and we were splitting our app into multiple modules, and I discovered a different approach: we could simply extract these functions into separate Gradle script files and apply them.
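The script-file approach looks roughly like this (a sketch; the file name and dependency are illustrative):

```groovy
// gradle/common.gradle — a standalone script file, shared by several modules.
// Note: the plugins {} DSL block cannot be used in a file like this.
apply plugin: 'java-library'

dependencies {
    implementation 'com.google.code.gson:gson:2.9.0' // illustrative dependency
}
```

A module would then pull it in with something like `apply from: "$rootDir/gradle/common.gradle"` in its own build.gradle.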

This solution lacks autocompletion, and there is no way to fix that. In a regular build.gradle you can at least improve autocompletion by using the plugins block, but that block is unavailable in all script files other than build.gradle.

And now, years later, a lot of developers, me included, are using buildSrc to manage common configurations. After so many painful years, using buildSrc in our projects is a blessing. You can use any JVM language, you have full autocomplete and IDE support, and you can even write tests: unit tests with JUnit or any other framework, and integration tests which actually start a separate Gradle instance with a test environment provided. Could it be that, at last, we have found the Holy Grail of Gradle configuration?!

Drawback of buildSrc

Unfortunately, all these cool features come with one massive drawback: any change inside buildSrc completely invalidates the build cache. It also invalidates the remote build cache, if you are using one. While this is not really a problem for small projects, big ones with hundreds of modules are affected badly. Changes in plain Gradle script files, on the other hand, don't invalidate the whole cache but merely invalidate some tasks.

Imagine the following chain of cacheable tasks: a compile task (from the built-in Java plugin) followed by our custom task. The compile task's type comes from the built-in Java plugin; the custom task is our own, and it can be defined in two different ways: inside buildSrc or inside an included build. Now let's make a bytecode-affecting change inside the custom task's class. With the buildSrc approach, both the compile task and the custom task will be executed again, even though the compile task's class, inputs and outputs have not changed. With the composite build approach, only the custom task will be executed again: the inputs, outputs and bytecode of the compile task have not changed, so its result can be taken from the cache. With buildSrc, Gradle can't verify that the tasks will produce the same outputs, which is why everything is launched again and the build cache is ignored. So, how can we fix this? We absolutely don't want to go back in time and lose all of buildSrc's fancy, cool features for the sake of build speed.
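For concreteness, a custom cacheable task like the second link in this chain might look as follows (a sketch; the task name and properties are assumptions, not the actual code):

```kotlin
// buildSrc/src/main/kotlin/MyCustomTask.kt — illustrative names throughout
import org.gradle.api.DefaultTask
import org.gradle.api.file.ConfigurableFileCollection
import org.gradle.api.file.RegularFileProperty
import org.gradle.api.tasks.CacheableTask
import org.gradle.api.tasks.InputFiles
import org.gradle.api.tasks.OutputFile
import org.gradle.api.tasks.PathSensitive
import org.gradle.api.tasks.PathSensitivity
import org.gradle.api.tasks.TaskAction

@CacheableTask
abstract class MyCustomTask : DefaultTask() {
    // wired from the compile task's output, forming the chain described above
    @get:InputFiles
    @get:PathSensitive(PathSensitivity.RELATIVE)
    abstract val classes: ConfigurableFileCollection

    @get:OutputFile
    abstract val report: RegularFileProperty

    @TaskAction
    fun generate() {
        report.get().asFile.writeText(classes.files.joinToString("\n") { it.name })
    }
}
```

Any edit to this class inside buildSrc invalidates the cache for the whole chain; the same class in an included build only causes this one task to rerun.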

Composite builds

Generally speaking, a composite build is a build that includes other builds with their own root projects. If you have a simple multi-module project, all subprojects share a common configuration defined by the root build.gradle. But there is a way of adding another project without affecting it: give it its own configuration and build it in isolation. It can be useful to build external libraries this way, as well as parts of your project that are completely independent of each other, but it can also be used for Gradle plugins: you can reference plugins from an included build by id in your main project.

If we extract our configuration logic from the buildSrc folder, classes from the included build are no longer treated as part of the build script classpath, and Gradle doesn't invalidate the build cache on every change. This is because the configuration logic is provided to the main project as an external dependency (in the same way as other plugins, e.g. the Android Gradle Plugin). This means Gradle can now correctly verify task inputs and outputs and can use the build cache.

It is important to add that these changes affect only the build cache. The build cache contains serialised task outputs, keyed by the task's inputs and classpath. When a task result is taken from the cache, you will see the FROM-CACHE status for the task. Incremental builds are not affected by this change and tasks will be invalidated anyway. Before launching a task, Gradle checks whether the task's inputs and outputs have changed since the last run; if they have not, you will see the UP-TO-DATE status for the task.

Migration from buildSrc to Composite build

Now, I am going to show the migration process of our Reaktive library to a composite build. This project is a very good example because it uses all three approaches described in the first part. Here I will show how to deal with each of them and convert them into separate plugins.

Copy buildSrc

The first step is quite simple: copy the buildSrc folder to a new folder for the included build. If you don't use plugins in your buildSrc, now is a good time to start: without any plugins, classes from your new module won't be loaded into the build script classpath. Don't remove the original buildSrc yet; if you do, you won't be able to sync the project. To inform Gradle about the new included build, we add the following to settings.gradle:
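The addition might look like this (a sketch: the included build's folder name, build-logic, is an assumption — use whatever you named the copied folder):

```groovy
// settings.gradle of the root project
rootProject.name = 'Reaktive'

// treat the project inside this folder as an included build
includeBuild 'build-logic'
```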

The first line is added for our convenience. By using the includeBuild function, we tell Gradle to treat the project inside the specified folder as an included build. This project will be built on demand.

Plugins migration

So, how do we use it? First, let's add our plugin definitions.
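In the included build's own build script, plugin definitions are registered through the Gradle Plugin Development Plugin. A sketch (the plugin id, version and class name here are assumptions):

```groovy
// build-logic/build.gradle — ids, versions and class names are illustrative
plugins {
    id 'org.jetbrains.kotlin.jvm' version '1.7.20'
    id 'java-gradle-plugin'
}

repositories {
    mavenCentral()
    gradlePluginPortal()
}

gradlePlugin {
    plugins {
        multiplatformLibrary {
            id = 'multiplatform-library'
            implementationClass = 'MultiplatformLibraryPlugin'
        }
    }
}
```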

The java-gradle-plugin will generate the corresponding plugin descriptor files (META-INF/gradle-plugins/*.properties) from these definitions, so you can remove any manually created ones. You can read more about the Gradle Plugin Development Plugin in the Gradle documentation.

If you don't yet have any plugins and are only using buildSrc for dependency management, you need to create an empty fake plugin and apply it to the project to make the classes available in the build script.
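The fake plugin is as small as it sounds — its only job is to bring the included build's classes onto the build script classpath (the class name is an assumption):

```kotlin
// build-logic/src/main/kotlin/FakePlugin.kt — intentionally empty
import org.gradle.api.Plugin
import org.gradle.api.Project

class FakePlugin : Plugin<Project> {
    override fun apply(target: Project) = Unit // no configuration needed
}
```

Register it in the gradlePlugin block like any other plugin, then apply it by id in the modules that need the classes.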

Once you apply the fake plugin to a project, the classes from the included build become available, and autocompletion works the same way as before.

Common functions migration

Inside buildSrc we had a setupMultiplatformLibrary function.
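It looked roughly like this (a sketch: the body and dependency notation are illustrative, not the exact Reaktive code):

```kotlin
// buildSrc/src/main/kotlin/Setup.kt — a sketch of the old buildSrc helper
import org.gradle.api.Project

fun Project.setupMultiplatformLibrary() {
    // apply the Kotlin Multiplatform plugin to the module
    plugins.apply("org.jetbrains.kotlin.multiplatform")
    // declare dependencies essential for every module (notation illustrative)
    dependencies.add("commonMainImplementation", "org.jetbrains.kotlin:kotlin-stdlib-common")
}
```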

This function defines some common configuration for all modules: it applies the Kotlin Multiplatform plugin and declares some essential dependencies.

To convert this function into a Gradle plugin, we need to declare a dependency on the Kotlin Gradle plugin and create our own plugin which performs the same setup.
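A sketch of the converted plugin (the class name is an assumption; the included build must also declare something like implementation 'org.jetbrains.kotlin:kotlin-gradle-plugin:&lt;version&gt;' in its dependencies):

```kotlin
// build-logic/src/main/kotlin/MultiplatformLibraryPlugin.kt — a sketch
import org.gradle.api.Plugin
import org.gradle.api.Project

class MultiplatformLibraryPlugin : Plugin<Project> {
    override fun apply(target: Project) {
        // same setup the setupMultiplatformLibrary function used to perform
        target.plugins.apply("org.jetbrains.kotlin.multiplatform")
        target.dependencies.add(
            "commonMainImplementation",
            "org.jetbrains.kotlin:kotlin-stdlib-common" // illustrative
        )
    }
}
```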

In addition, we have a parameterised setup in setupAllTargetsWithDefaultSourceSets. Kotlin coroutines do not support every target that we do, which is why we should avoid configuring unsupported targets for the coroutines interoperability module. To do so, we could simply filter out the target, but we don't want to hardcode it. Instead, we can implement this with an extension, the same approach other plugins use.

Unfortunately, we can't use an opt-out system here, because the Kotlin Gradle plugin only handles target addition, not target removal. However, we can apply the configuration with the help of our plugin and its extension.
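The opt-in, extension-based approach can be sketched like this (all names and the flag are assumptions):

```kotlin
// build-logic/src/main/kotlin/TargetsPlugin.kt — all names are illustrative
import org.gradle.api.Plugin
import org.gradle.api.Project

// Opt-in extension: since the Kotlin Gradle plugin can only add targets,
// never remove them, the flag defaults to "off"
open class TargetsExtension {
    var supportAllTargets: Boolean = false
}

class TargetsPlugin : Plugin<Project> {
    override fun apply(target: Project) {
        val targets = target.extensions.create("targets", TargetsExtension::class.java)
        target.plugins.apply("org.jetbrains.kotlin.multiplatform")
        // defer reading the flag until the build script has been evaluated
        target.afterEvaluate {
            if (targets.supportAllTargets) {
                // configure the extra targets here
            }
        }
    }
}
```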

And autocompletion works as expected.

External script file migration

We use the Binary Compatibility Validator plugin, which I described in a previous article. Its configuration is defined inside binary-compatibility.gradle and applied to the root build.gradle. Basically, all it does is apply the plugin and configure the modules to ignore.

And we can simply convert this build script into a plugin using the approach described previously.
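Converted to a plugin, it might look like this (a sketch: the ignored module names are placeholders, and the included build needs the validator plugin on its compile classpath for the extension class to resolve):

```kotlin
// build-logic/src/main/kotlin/BinaryCompatibilityPlugin.kt — a sketch
import kotlinx.validation.ApiValidationExtension
import org.gradle.api.Plugin
import org.gradle.api.Project

class BinaryCompatibilityPlugin : Plugin<Project> {
    override fun apply(target: Project) {
        // apply the Binary Compatibility Validator plugin by id
        target.plugins.apply("org.jetbrains.kotlinx.binary-compatibility-validator")
        // exclude modules that don't need API tracking (names are placeholders)
        target.extensions.configure(ApiValidationExtension::class.java) { validation ->
            validation.ignoredProjects.addAll(listOf("sample-android-app", "benchmarks"))
        }
    }
}
```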

Next, we can apply our new plugin inside the root build.gradle.

Dependencies

With this new approach, we need dependency management both in the main project and in the included builds. In the included builds, we use various plugins as dependencies to get access to their classes, such as project extensions, tasks, etc. For this purpose, we will create an additional included build to manage dependencies.

Now we can create a class for all the external dependencies we have. You can check the implementation here. Add the new build to settings.gradle using includeBuild. Now the plugin and the dependencies class can be used in any project.
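Such a dependencies class is just a plain Kotlin object (a sketch; coordinates and versions are illustrative):

```kotlin
// dependencies/src/main/kotlin/Deps.kt — coordinates and versions illustrative
object Deps {
    object Kotlin {
        const val stdlib = "org.jetbrains.kotlin:kotlin-stdlib:1.7.20"
    }
    const val junit = "junit:junit:4.13.2"
}
```

In a module's Groovy script it is then used as, for example, `implementation Deps.Kotlin.stdlib`.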

While implementing this approach, I discovered that if I use an included build as a dependency inside another included build, IDEA does not resolve its classes in the editor (although everything still compiles without errors). To fix this, I manually included the dependencies class in the compilation so that it is available wherever any plugin from the included build is applied. It's quite a dirty and unstable hack, but I hope that fixes in future releases will make it unnecessary. Once fixed, the regular dependency notation will be sufficient.

The plugin can be used in the main project's modules in the same way as described above.

Drawbacks

The only difference between using a composite build and buildSrc is that classes are no longer available in build scripts without applying a corresponding plugin.

The real difference appears when we use an extension block instead of direct configuration function calls. The main advantage of an extension block is autocompletion inside Groovy scripts: you get access to the classes related to specific plugins and extensions. But you cannot be completely sure that extensions are fully configured at the moment the plugin is applied. For example, say you had the following setup:
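A sketch of the direct-call style (all names are assumptions):

```kotlin
// build-logic/src/main/kotlin/Setup.kt — direct-call style, names illustrative
import org.gradle.api.Project

fun Project.setupMessage(message: String) {
    // runs at the moment the build script calls it, so the value is final
    println(message)
}
```

Called from a Groovy script as `SetupKt.setupMessage(project, 'hello')`, it prints the value that was passed in.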

The custom plugin just prints a value to the console during the configuration phase. In this implementation, it prints the configured value. Now let's migrate it to use an extension block.
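The extension-based version might look like this (all names are assumptions):

```kotlin
// build-logic/src/main/kotlin/MessagePlugin.kt — extension style, names illustrative
import org.gradle.api.Plugin
import org.gradle.api.Project

open class MessageExtension {
    var message: String = "default"
}

class MessagePlugin : Plugin<Project> {
    override fun apply(target: Project) {
        val extension = target.extensions.create("message", MessageExtension::class.java)
        // runs on apply, BEFORE the script's message { } block is evaluated
        println(extension.message)
    }
}
```

In a module's build.gradle you would then apply the plugin and configure `message { message = 'hello' }` — but by that point the plugin has already printed.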

In this case, you will see the default value in the console output, because the plugin is applied before the extension block is evaluated. There are a couple of ways to fix this:

1. Use direct configuration calls or static functions to set up the build. It is OK to continue using them if migration is hard. The only thing you lose is autocompletion support in Groovy scripts. Also, remember that you need to apply a fake plugin to load the corresponding classes.

2. Use the afterEvaluate block inside the plugin. But be careful: if you abuse it too much, your afterEvaluate blocks will start to depend on other afterEvaluate blocks and on the order of their execution.

3. Try to convert the logic inside the plugin to use lazy APIs, task providers and other such mechanisms. This requires good knowledge of the Gradle API, but this way you will create reusable and independent plugins.
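The third option can be sketched with Gradle's lazy Property API — the value is read only when the task executes, after all configuration has run (names are assumptions):

```kotlin
// build-logic/src/main/kotlin/LazyMessagePlugin.kt — names illustrative
import org.gradle.api.DefaultTask
import org.gradle.api.Plugin
import org.gradle.api.Project
import org.gradle.api.provider.Property
import org.gradle.api.tasks.Input
import org.gradle.api.tasks.TaskAction

interface LazyMessageExtension {
    val message: Property<String>
}

abstract class PrintMessageTask : DefaultTask() {
    @get:Input
    abstract val message: Property<String>

    @TaskAction
    fun print() = println(message.get()) // resolved at execution time
}

class LazyMessagePlugin : Plugin<Project> {
    override fun apply(target: Project) {
        val extension = target.extensions.create("message", LazyMessageExtension::class.java)
        extension.message.convention("default")
        target.tasks.register("printMessage", PrintMessageTask::class.java) { task ->
            // Property-to-Property wiring stays lazy: whatever the build
            // script sets later is what the task will eventually print
            task.message.set(extension.message)
        }
    }
}
```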

Conclusion

Composite builds can be used as a replacement for buildSrc to avoid Gradle build cache invalidation. Migration using the proposed approach is both straightforward and painless. Included builds can use plugins from other included builds and depend on each other. To get the full power of autocompletion in your Groovy scripts, you can use extension blocks if you like. If you have no plugins, just create an empty fake plugin and apply it to load classes from the module.
