Google Developers
Developing for Android II
The Rules: Memory

[Previous Chapter: Understanding the Mobile Context]

The use of memory in an application can be the single most important determinant of how well that application behaves, how good its user experience is, and how good the overall device experience is. Memory factors include how large the application's footprint is (in storage and when resident in memory) and how the application churns memory (causing garbage collections, which affect runtime performance).

Avoid Allocations in Inner Loops

Allocations are unavoidable. But avoid them when possible, particularly in code paths that will be called frequently, such as drawing code that may execute on every frame a view is rendered.

For example, animations may call the onDraw() method of your custom view, so you should avoid allocating objects in that method. Consider, instead, allocating cached objects that are used temporarily in places where otherwise a new allocation would be necessary for a temporary object. A typical example of this is code that allocates a new Paint object in onDraw() because calls to Canvas methods require a Paint object. It makes more sense to allocate a single Paint object once for that custom view instance and reuse it in that method.
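The caching pattern can be sketched in plain Java (Paint, Canvas, and View are Android classes, so this sketch uses a minimal hypothetical stand-in for Paint to stay self-contained); the structure mirrors a custom view that allocates its Paint once in the constructor rather than on every frame:

```java
public class CachedPaintView {
    /** Minimal hypothetical stand-in for android.graphics.Paint. */
    static class Paint {
        int color;
        Paint(int color) { this.color = color; }
    }

    // Allocated once per view instance, not once per frame.
    private final Paint textPaint = new Paint(0xFF000000);

    private int drawCount = 0;

    /** Mirrors onDraw(Canvas): reuses the cached Paint, allocates nothing. */
    public Paint onDraw() {
        drawCount++;
        return textPaint; // same instance every frame
    }

    public int getDrawCount() { return drawCount; }
}
```

The key point is that the field is initialized once with the object, and the hot path only reads it.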

Avoid Allocations When Possible

In the interest of avoiding constant, temporary allocations, there are several strategies to consider. Note that some of these are not considered great coding or API decisions in traditional Java development, but they are recommended for Android with the tradeoff of style vs. garbage. As always, use the tools to help determine whether any particular code is worth optimizing. If there is some section of code that is executed rarely (like when the user changes some setting), but which would benefit from a clearer style, then a more traditional layer of abstraction could be the right decision. But if analysis shows that you are re-executing some code path often and causing lots of memory churn in the process, consider these strategies for avoiding excess allocations:

Prefer passing in a container object rather than returning a newly allocated one. For example, instead of:

    Rect getRect(int w, int h)

have the caller supply (and reuse) the object to be filled in:

    void getRect(Rect rect, int w, int h)
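Both signatures can be sketched side by side in plain Java (Rect here is a minimal hypothetical stand-in for android.graphics.Rect, so the sketch is self-contained): the caller owns one scratch Rect and reuses it, so repeated calls in a hot path allocate nothing.

```java
public class RectDemo {
    /** Minimal hypothetical stand-in for android.graphics.Rect. */
    static class Rect {
        int left, top, right, bottom;
        void set(int l, int t, int r, int b) {
            left = l; top = t; right = r; bottom = b;
        }
    }

    // Allocating version: creates one new Rect per call.
    static Rect getRect(int w, int h) {
        Rect r = new Rect();
        r.set(0, 0, w, h);
        return r;
    }

    // Allocation-free version: fills in the caller's Rect.
    static void getRect(Rect out, int w, int h) {
        out.set(0, 0, w, h);
    }
}
```

The caller allocates the scratch Rect once (for example, as a field) and passes it into every call.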

Avoid Iterators

Explicit (e.g., List.iterator()) or implicit (e.g., for (Object o : myObjects)) iteration causes the small allocation of an Iterator object (with the exception of arrays, which can be used with the foreach syntax without causing an Iterator allocation). This single allocation is not a big deal in practice, but it should be avoided in inner loops for the same reasons discussed above. Meanwhile, iterating explicitly over the indices of a collection avoids any allocations:

    final int count = myList.size();
    for (int i = 0; i < count; ++i) {
        Object o = myList.get(i);
        // …
    }

Also, note that requesting an iterator always causes an allocation, even on an empty list. So if you use the foreach syntax on a Collection (for (Object o : myObjects)), you cause an allocation even if that collection is typically empty. If you must use the foreach syntax, at least check isEmpty() on the collection first to avoid this needless allocation.
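Both points can be sketched together (names hypothetical): the index-based loop never requests an Iterator, and an isEmpty() guard keeps the foreach form allocation-free in the common empty case.

```java
import java.util.List;

public class IterationDemo {
    // Index-based loop: no Iterator is ever allocated.
    static int sum(List<Integer> values) {
        int total = 0;
        final int count = values.size();
        for (int i = 0; i < count; ++i) {
            total += values.get(i);
        }
        return total;
    }

    // If foreach is unavoidable, skip it entirely for empty collections
    // so no Iterator is allocated when there is nothing to iterate.
    static int sumForeach(List<Integer> values) {
        if (values.isEmpty()) {
            return 0; // no Iterator allocated
        }
        int total = 0;
        for (int v : values) {
            total += v;
        }
        return total;
    }
}
```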

Avoid Enums

Enums are typically used to represent constants, but they are much more expensive than primitive-type representations, in terms of the code size and the memory allocated for the enum objects.

An occasional enum is not a big deal in terms of the memory it consumes or its allocation costs. And ProGuard can, in some situations where it can statically analyze all usages, optimize enums to int values. But enums become a problem when used widely across a large application or, even worse, when used broadly in a library or an API that is then used by many other applications.

Note that using the @IntDef annotation, which is supported by Android Studio and Gradle 1.3+, will give your code build-time type safety (when lint errors are enabled), while retaining the size and performance benefits of using int variables.

Avoid Frameworks and Libraries Not Written for Mobile Applications

It is tempting to use frameworks that you may be familiar with from other Java environments. For example, Guice is a commonly used Java dependency-injection library. But because it was not written or optimized for mobile, or for the constraints discussed in this document, applications that use libraries like it will suffer from the problems described herein.

Also, note that if you are using only a small portion of a large library, you will tend to drag in large amounts of unused code and bloat your application footprint unnecessarily. Although ProGuard can strip out unused code in many situations, dependency graphs in large libraries can defeat this optimization (and requiring ProGuard as a build dependency can significantly increase the build time for your application).

Some libraries introduced in recent years are written specifically for mobile applications and may be worth looking into, but avoid picking up random libraries unless you know they do not suffer from the practices that can result in bad Android applications.

Some of the problems that result from using these generic, non-mobile libraries include increased memory consumption and thrashing. You should be able to determine the extent of these problems by monitoring the memory usage and garbage-collection behavior of your app.

Avoid Static Leaks

While static objects can be a useful means to avoid temporary allocations, beware of using static variables to cache objects that should not actually persist for the life of the entire process. Note, in particular, that the lifetime of a static variable (which is equal to the lifetime of the underlying process) is not the same as the lifetime of an Activity. This misunderstanding can lead, and has led, to leaking Activity objects across configuration changes (in which activities are destroyed and recreated) when static maps hold onto activity instances (directly or indirectly). Activities are quite expensive, and this kind of leak can quickly lead to your application, and the system, running out of memory.
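One way to keep a static cache from pinning short-lived objects is to hold them through WeakReference, so the garbage collector can reclaim a value once nothing else refers to it; a plain-Java sketch with hypothetical names (for activities specifically, the better fix is to not put them in static structures at all, or to remove them explicitly in onDestroy()):

```java
import java.lang.ref.WeakReference;
import java.util.HashMap;
import java.util.Map;

public class ScreenCache {
    // A static Map<String, Activity> would leak every destroyed Activity.
    // WeakReference lets the GC reclaim the value once nothing else
    // holds a strong reference to it.
    private static final Map<String, WeakReference<Object>> CACHE =
            new HashMap<>();

    public static void put(String key, Object value) {
        CACHE.put(key, new WeakReference<>(value));
    }

    // Returns null if the key was never cached or the value was collected.
    public static Object get(String key) {
        WeakReference<Object> ref = CACHE.get(key);
        return ref == null ? null : ref.get();
    }
}
```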

Avoid Finalizers

Because of nuances of the Java language specification, finalizers require not just one full garbage collection, but two. This means that not only will the resources freed by the finalizer not be available until both of those garbage collections occur, but also that you are forcing the system to undergo two collections, which can be both costly and janky, depending on what else is happening in the system. There is one specific situation that requires finalization: when your object holds a native pointer. If this is not the case for your code, avoid finalizers completely.

If you do require finalizers, consider implementing the AutoCloseable interface and freeing native resources within the scope of their usage via the close() method.
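The AutoCloseable approach can be sketched as follows (the class and its fields are hypothetical; in real code close() would free a native pointer, e.g. via JNI). With try-with-resources, close() runs deterministically at the end of the block, with no garbage collection involved:

```java
public class NativeBuffer implements AutoCloseable {
    private boolean released = false;

    // Stand-in for freeing a native resource; real code would call into
    // native code here instead of relying on a finalizer.
    @Override
    public void close() {
        released = true;
    }

    public boolean isReleased() { return released; }
}
```

Usage: `try (NativeBuffer buf = new NativeBuffer()) { /* use buf */ }` releases the resource as soon as the block exits, even on an exception.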

Avoid Excess Static Initialization

Expensive initialization can cause performance problems at crucial times in your application's lifecycle, such as startup, contributing to poor user experience. Perform initialization on demand to avoid loading code into memory until you actually need it.
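One common way to defer this work, sketched here with hypothetical names, is the lazy-holder idiom: the expensive object is built on first use rather than in a static initializer that runs at class load (and therefore possibly at startup). Class loading rules also make this thread-safe without explicit locking:

```java
public class ExpensiveTable {
    private static int buildCount = 0;

    // The holder class is loaded -- and INSTANCE constructed -- only on
    // the first call to get(), not when ExpensiveTable itself is loaded.
    private static class Holder {
        static final ExpensiveTable INSTANCE = new ExpensiveTable();
    }

    private ExpensiveTable() {
        buildCount++; // stand-in for expensive initialization work
    }

    public static ExpensiveTable get() { return Holder.INSTANCE; }

    public static int getBuildCount() { return buildCount; }
}
```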

Trim Caches on Demand

As of API level 14, ComponentCallbacks2 provides the onTrimMemory() callback that allows your app to release memory when the system is under memory pressure. The Doing More with Less video from Google I/O 2012 shows an example of using this approach with bitmaps in an LruCache.
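A trimmable LRU cache can be sketched in plain Java (on Android you would typically use android.util.LruCache; the class and method names here are hypothetical). Under memory pressure, onTrimMemory() would call trimToSize() with a smaller budget to evict the least-recently-used entries:

```java
import java.util.Iterator;
import java.util.LinkedHashMap;
import java.util.Map;

public class TrimmableCache<K, V> {
    private int maxSize;
    // accessOrder=true makes iteration order least- to most-recently used.
    private final LinkedHashMap<K, V> map =
            new LinkedHashMap<>(16, 0.75f, true);

    public TrimmableCache(int maxSize) { this.maxSize = maxSize; }

    public void put(K key, V value) {
        map.put(key, value);
        trimToSize(maxSize);
    }

    public V get(K key) { return map.get(key); }

    public int size() { return map.size(); }

    // Called from e.g. onTrimMemory(): evict least-recently-used entries
    // until the cache fits the new budget.
    public void trimToSize(int newMax) {
        maxSize = newMax;
        Iterator<Map.Entry<K, V>> it = map.entrySet().iterator();
        while (map.size() > maxSize && it.hasNext()) {
            it.next();
            it.remove();
        }
    }
}
```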

Use isLowRamDevice

ActivityManager.isLowRamDevice() was added as part of the Svelte effort in KitKat to help developers detect when their application is running on a device with particularly constrained memory (currently, a return value of true typically indicates memory of 512MB or less). This condition should be used by applications to decide whether to disable features that require more memory than would reasonably be available on such a device.

Avoid Requesting a Large Heap

Applications can request a large heap from the system in the application tag of their manifest… but they shouldn’t. Requesting a large heap is a classic example of the Tragedy of the Commons (discussed in Chapter I): it might make sense for an application considered on its own, but it is the wrong decision for an app that is part of an overall device experience.

Requesting a larger heap may be necessary in some rare situations where the type of media content regularly needed by the user easily swamps the default heap limit. But applications that use this just to avoid having to manage their memory and resources more carefully are only causing problems for the overall user experience of the device. Applications requesting a larger heap necessarily leave less memory available for other processes on the device, forcing other applications to be killed and restarted as the user switches between activities.

Avoid Running Services Longer than Necessary

Every process on the system takes up limited resources. If you don’t need your service to run all the time (and ideally you don’t), shut it down whenever possible.

Android provides many mechanisms for ensuring that components only run for as long as necessary to perform a given function.

Optimize for Code Size

Lean applications are fast applications. The less code you have to load, the less time it will take to download your application, and the faster your application will start and initialize.

[Next Chapter — The Rules: Performance]

Engineering and technology articles for developers, written and curated by Googlers. The views expressed are those of the authors and don't necessarily reflect those of Google.