Patterns for accessing code from Dynamic Feature Modules

Wojtek Kaliciński
Jun 4, 2019

In this post I am going to show you how to solve a common problem that arises when using dynamic feature modules in your project: code defined inside them is inaccessible to your base application during compilation… unless you use reflection 😱.

Just the mention of reflection is enough to scare some developers away, but there are a few elegant solutions to safely access what you need using at most one reflection call… and a clever optimization technique to get that number down to zero!

Read on to learn about different approaches you can take: ServiceLoaders, reflection, and how it can work with Dagger2 in a sample modularized app, available on GitHub.

Why dynamic feature modules?

Last year, we launched Android App Bundles and Google Play’s Dynamic Delivery to help developers reduce app size and streamline the release process. Find out more about the benefits of app bundles.

But switching to app bundles is just the first step. The new format opens up new possibilities for modularizing your code base to deliver an even more customized experience to users.

At Google I/O 2019, we announced general availability of on-demand and conditional modules, letting you decide when certain parts of your app are installed.

For on-demand modules, you use the Play Core API to install new code and resources when the user navigates to an optional feature in your app. For conditional modules, availability is determined at install time by the user’s device configuration, such as hardware and software features, the user’s country, or the Android OS version.

What these new, customizable types of modules have in common is that they both use the com.android.dynamic-feature Gradle plugin. We’re going to call them dynamic feature modules, or DFMs. When you distribute your app as an Android App Bundle, the Play Store will deliver these modules to users’ devices as separate APKs at the appropriate time.

From libraries to dynamic feature modules

Migrating some of your library modules to DFMs requires an important change in your project structure: inverting the dependency between the base module and any dynamic features.

Dynamic feature modules (top) depend on the base application. The application depends on regular libraries.

In the standard project structure that uses libraries, the base module depends on the library modules, which means you can freely use any classes defined in a library from the base module.

With DFMs, however, the base is a dependency of the feature modules. This means you can use any classes defined in the base module and its libraries from the DFM, but you can’t reference any code defined in the DFM from the base application at compile time.

Once the APKs produced by the DFM are installed with your application (regardless of the delivery mechanism: install-time, on-demand or conditional), their code is available to the ClassLoader at runtime.

Remember to always use SplitCompat from Play Core Library when you have on-demand modules.

There are a couple of obvious solutions to this problem, depending on your use case. If all you need from your DFM is to launch an Activity, you can simply pass its name (as a String; you can’t use .class notation, because the class isn’t accessible at compile time) to startActivity:
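For example, a sketch of such a call (the Activity’s fully qualified name below is a placeholder for whatever your module actually defines):

```kotlin
import android.content.Context
import android.content.Intent

// Launches an Activity that lives in a DFM. The class name is passed as a
// String because the DFM's classes are not on the compile-time classpath.
fun launchStorageActivity(context: Context) {
    val intent = Intent().setClassName(
        context.packageName,
        "com.example.android.storage.StorageActivity" // placeholder name
    )
    context.startActivity(intent)
}
```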

This is the approach that we used in the dynamic features sample. You can also check it out if you want to learn how to use the Play Core Library to install on-demand modules.

In the following sections I want to show you how to load a custom class from a DFM (after it has been installed) and access it in the base module, using three approaches. You can find the full code for the example used in this article in the dynamic code loading sample.

Project setup

In the sample code that we’ll be looking at, the app contains a simple counter that the user can increment by pressing a button. The counter resets every time the app is relaunched, unless the user installs an on-demand module that contains code which lets the app save the state of the counter to storage.

Of course that’s not something you’d do in a real app, but it gives us a simplified scenario to work with that you can adapt to your own dynamically loaded code.

The sample code is organized such that common parts are found in the main source set, but each approach for accessing code dynamically is implemented in its own variant.

The base module, “app”, contains these shared classes:

  • MainActivity contains the UI,
  • AbstractMainViewModel contains the logic, but needs to be subclassed in each build variant to implement its abstract fun initializeStorageFeature()

You can switch between the three approaches using the Build Variant switcher in Android Studio:


The storage module is a dynamic feature module configured for on-demand delivery. What we’ll be looking at in this article is how you can access a class that lives in the storage module from the base app module.

Splitting the interface from the implementation

To begin, we need to create an interface for our dynamic feature code. We define it in the app module, so that it can be used to directly call any functions implemented by the feature.

The interface for my storage feature is simple: it needs to be able to store and retrieve an integer.
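A minimal sketch of that interface, defined in the base module (the names mirror the counter scenario and may differ slightly from the sample code):

```kotlin
// In the base "app" module, visible to both the app and the DFM.
interface StorageFeature {
    fun saveCounter(counter: Int)
    fun loadCounter(): Int
}
```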

Because we leave the implementation of that interface to be defined in the DFM module, the implementer is free to choose what their constructor will look like. Our storage feature will need a Context, and might even request some other dependencies (like a Logger for example), and the implementation might look something like this:
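For illustration, here is one possible implementation backed by SharedPreferences (the backing store is my assumption, and Logger stands in for whatever logging abstraction your base module defines):

```kotlin
import android.content.Context

// A simple logging abstraction assumed to live in the base module.
interface Logger {
    fun log(message: String)
}

// Lives in the "storage" DFM; the implementer chose this constructor shape.
class StorageFeatureImpl(
    private val context: Context,
    private val logger: Logger
) : StorageFeature {

    private val prefs =
        context.getSharedPreferences("storage", Context.MODE_PRIVATE)

    override fun saveCounter(counter: Int) {
        prefs.edit().putInt("counter", counter).apply()
        logger.log("Counter saved")
    }

    override fun loadCounter(): Int = prefs.getInt("counter", 0)
}
```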

To make it simple to obtain an instance of this concrete StorageFeature implementation and leave the instantiation to the implementer, we will also define a storage feature Provider interface with a single get() method that, given the necessary dependencies, will return a ready-to-use StorageFeature:
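Sketched here as a nested interface, taking the dependencies as explicit parameters (Logger is again a hypothetical extra dependency):

```kotlin
import android.content.Context

interface StorageFeature {
    fun saveCounter(counter: Int)
    fun loadCounter(): Int

    // Implemented in the DFM; given the feature's dependencies,
    // returns a ready-to-use StorageFeature.
    interface Provider {
        fun get(context: Context, logger: Logger): StorageFeature
    }
}
```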

Again, we will put the interface in the base module, while the DFM will contain the implementation.

You can check out the interface definition here, with a small difference: I’m using another interface to hold the feature’s Dependencies, which will be useful for my Dagger setup later on, but is not really required otherwise.

Approach 1: Direct reflection call

The most straightforward method to obtain the StorageFeature instance in our base app module is a single reflection call. If you’re checking out the sample while reading this, switch to the reflectDebug build variant.

Take a look at the first MainViewModel implementation in the reflect source set:
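It boils down to a single reflective lookup; a sketch of the idea (the provider’s class name, and the storageFeature and logger members inherited from the abstract ViewModel, are illustrative):

```kotlin
import android.app.Application

class MainViewModel(app: Application) : AbstractMainViewModel(app) {

    override fun initializeStorageFeature() {
        // The hard-coded String stands in for the .class notation we
        // can't use at compile time.
        val provider = Class.forName(
            "com.example.android.storage.StorageFeatureProviderImpl"
        ).kotlin.objectInstance as StorageFeature.Provider

        storageFeature = provider.get(getApplication(), logger)
    }
}
```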

Notice that I’m using the standard Class.forName() to get the class handle of the Provider, but then Kotlin gives us a very nice way to get its singleton instance — objectInstance. This works because my Provider is declared as a Kotlin object:
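On the DFM side, the provider might be declared like this (names are illustrative; StorageFeatureImpl is the concrete implementation from earlier):

```kotlin
import android.content.Context

// In the storage DFM. Declaring it as a Kotlin object gives us a
// singleton that KClass.objectInstance can return.
object StorageFeatureProviderImpl : StorageFeature.Provider {
    override fun get(context: Context, logger: Logger): StorageFeature =
        StorageFeatureImpl(context, logger)
}
```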

If you’re not using Kotlin or don’t want to have an object, you can revert to the regular Class#newInstance() call.

Once you have a reference to the Provider, you don’t need any further reflection calls: you simply call get() and continue calling your feature’s code through the StorageFeature interface, with compile-time safety.

Approach 2: Leveraging the ServiceLoader

Our next approach: load an interface implementation using a ServiceLoader instead of using reflection directly. This can be especially useful if you want to load multiple implementations of the same interface.

Unfortunately, ServiceLoader comes with a serious performance impact on Android by default, which is why you shouldn’t use it unless you are on the newest versions of R8 (1.5.X branch, included in Android Gradle Plugin 3.5.0+) with code shrinking and optimizations enabled.

When working with ServiceLoader and R8 optimizations, you get the benefit of getting rid of reflection entirely in the final byte code.

Using ServiceLoader is fairly straightforward. Here’s our MainViewModel implementation (remember to switch to the serviceLoaderRelease build variant in Studio):
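A sketch of that implementation (the ViewModel details and the storageFeature and logger members are illustrative, as before):

```kotlin
import android.app.Application
import java.util.ServiceLoader

class MainViewModel(app: Application) : AbstractMainViewModel(app) {

    override fun initializeStorageFeature() {
        // The two-argument load() overload with class constants,
        // as required for the R8 optimization.
        val serviceLoader = ServiceLoader.load(
            StorageFeature.Provider::class.java,
            StorageFeature.Provider::class.java.classLoader
        )
        // Call nothing but iterator() on the returned ServiceLoader.
        storageFeature = serviceLoader.iterator().next()
            .get(getApplication(), logger)
    }
}
```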

To get the Provider, call ServiceLoader.load() and pass in the interface that you’re looking for as the first argument.

There are three conditions you must meet to enable the R8 optimization:

  • You must call the two-argument version of load()
  • Both arguments must use class constants (.class in Java or ::class.java in Kotlin)
  • You must not call any methods on the returned ServiceLoader other than iterator()

You can iterate over the returned ServiceLoader to get instances of any implementations of the interface that have been found.

The ServiceLoader knows where to look based on a file you have to put in the META-INF/services folder in the DFM. The filename has to match the fully qualified name of the interface, and the file’s contents are the fully qualified name of the implementing class.
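Note that for a nested interface, the binary name (with a $ separator) is used. For example (package names are placeholders):

```
# File, in the DFM's resources:
#   src/main/resources/META-INF/services/com.example.android.app.StorageFeature$Provider
# Its single line of content, naming the implementing class:
com.example.android.storage.StorageFeatureProviderImpl
```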

ServiceLoader instantiates the Provider for you, so it has to have a no-argument constructor.

To verify that the R8 optimization worked, you can use the DEX Viewer in the Android Studio APK Analyzer and search for references to the ServiceLoader class in the base DEX file. If R8 did its job, you shouldn’t find any.

Example of DEX Viewer output when R8 optimization was not enabled or did not work: ServiceLoader.load() is still present in the file and is being called from our app’s code.

If you can still see a call to ServiceLoader.load() (as in the example above), the optimization did not work: you’ll be doing disk I/O on the calling thread, which can freeze your app. So if you choose to use the ServiceLoader pattern, make sure you are enabling the optimization correctly.

Approach 3: Integrating with Dagger2

The third approach shows how you can integrate with Dagger 2 to help instantiate the object graph on the DFM side. Check it out in the daggerDebug build variant.

So far we’ve dealt with a very simple case, where the feature code only requires a Context and then is able to create and return a StorageFeature object.

In reality, your dependencies can be a lot more complicated and using a dependency injection framework such as Dagger2 can assist you in making sure the right dependencies are instantiated and passed to the DFM.

Because of the way the Gradle module dependencies are set up, we can’t use a subcomponent of our base Dagger component in the DFM. However, we can use another mechanism called component dependencies to get the required dependencies from the parent component. You declare the dependency in the feature component:
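A sketch of the feature’s component, declared in the DFM (assuming the Dependencies interface mentioned earlier is nested in StorageFeature; names are illustrative):

```kotlin
import dagger.Component

// "dependencies" lists interfaces whose bindings are supplied by
// another, already-built component - here, the base app's component.
@Component(dependencies = [StorageFeature.Dependencies::class])
interface StorageComponent {
    fun storageFeature(): StorageFeature
}
```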

And we’ll use the StorageFeature.Provider implementation to instantiate the feature component and return the StorageFeature:
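For example (DaggerStorageComponent is the implementation Dagger would generate for a StorageComponent declared in the DFM; exact names depend on your setup):

```kotlin
// In the storage DFM. Builds the feature's object graph on demand.
object StorageFeatureProviderImpl : StorageFeature.Provider {
    override fun get(
        dependencies: StorageFeature.Dependencies
    ): StorageFeature =
        DaggerStorageComponent.builder()
            .dependencies(dependencies)
            .build()
            .storageFeature()
}
```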

All that remains is a way to pass in the required dependencies. Because we want Dagger to take care of that as well, we can actually make our base component implement the Dependencies interface:
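For example (AppModule and the injection target are placeholders for your existing Dagger setup):

```kotlin
import dagger.Component
import javax.inject.Singleton

// In the base app module. Extending StorageFeature.Dependencies makes
// Dagger verify at compile time that every binding the feature needs
// (Context, Logger, ...) exists in this graph.
@Singleton
@Component(modules = [AppModule::class])
interface AppComponent : StorageFeature.Dependencies {
    fun inject(activity: MainActivity)
}
```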

Now whenever you try to compile your app, Dagger will tell you if any dependencies are missing. For example, I’m making sure the base component has a Logger object bound, so that it can pass it on to the feature component as part of the StorageFeature.Dependencies.

And finally, we have a custom @Provides method in the base module that will take care of getting the StorageFeature.Provider via reflection, passing in the required dependencies and caching the result:
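A sketch of such a module, assuming your base component (call it AppComponent) implements StorageFeature.Dependencies as described above; the provider’s class name is illustrative, and @Singleton supplies the caching:

```kotlin
import dagger.Module
import dagger.Provides
import javax.inject.Singleton

@Module
class StorageFeatureModule {

    // Dagger binds a component's own instance into its graph, so we can
    // request AppComponent here and hand it to the feature as its
    // Dependencies. @Singleton means the reflective lookup runs once.
    @Provides
    @Singleton
    fun storageFeature(appComponent: AppComponent): StorageFeature {
        val provider = Class.forName(
            "com.example.android.storage.StorageFeatureProviderImpl"
        ).kotlin.objectInstance as StorageFeature.Provider
        return provider.get(appComponent)
    }
}
```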

This is of course one way of making Dagger work with the dynamic feature modules in your app, but a lot will depend on your current Dagger setup. Feel free to take the ideas from this sample or use whatever fits your current architecture.

For example, instead of caching the result in the Module, you can try using dagger.Lazy<StorageFeature> at the injection site.

Whichever approach you choose, you have to make sure that you only instantiate the StorageFeature object graph after the DFM APK has been installed in your app.

If your DFMs are delivered conditionally or on-demand, remember to verify that first with SplitInstallManager.installedModules.

Additional learning

Modularization is a very broad topic that has implications for much more than simple class loading, such as the overall architecture of your app and even the final UX that end users will see. In this article I focused on solving one specific problem: loading code from DFMs via reflection.

If you want to learn more about modularizing your app and finding the right architecture, take a look at Yigit Boyar’s and Florina Muntenescu’s talk from Google I/O ‘19:

You can also check out the code changes that were made when applying techniques similar to those described in this article in a real app:

Furthermore, Ben Weiss published a more general article that is a case study on modularizing Plaid a while ago:

The dynamic code loading sample that this article is based on uses the Play Core Library. In order to enable on-demand module delivery from Play, you should get familiar with the API and implement it correctly as stated in the linked documentation pages.

Remember to subscribe to our Android Developers publication for more stories about Android, Play, Kotlin and more!
