Investigating UIImage performance

Recently I have been working on optimising image rendering in the Zendesk SDK for iOS, after noticing that attaching too many images would cause the SDK to crash. The crash was always accompanied by a memory warning, so the cause seemed pretty clear.

Profiling allocations immediately revealed something strange: there was a large spike in memory usage when the user first selected an image to attach to a message. When the user leaves the screen we cache the image to disk, and reading the image back from the cache did not cause the same spike.

After some quick investigation, I satisfied myself that our custom image picker was not leaking horribly, so I began looking into the memory usage of UIImage itself. I was unable to find a write-up with all the information I needed, so I hope this article is of use to anyone facing the same problems.

Profiling UIImage

I had three questions when trying to understand the memory usage of UIImage:

  1. Does the way a UIImage is initialised make any difference?
  2. Does rendering a UIImage cause memory usage to increase?
  3. Does returning a UIImage from PHImageManager affect memory usage?

All tests were done on an iPhone 6 running iOS 10.

UIImage creation

Broadly speaking, there are two ways of initialising a new UIImage object: loading from a bundle by name, or from Data obtained elsewhere.
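As a sketch, the two paths look like this (the file name "test-image" is a placeholder, not the actual asset from my tests):

```swift
import UIKit

// 1. Load by name: the image is looked up in the bundle, and the system
//    can be lazy about decoding it.
let bundled = UIImage(named: "test-image")

// 2. Load from Data obtained elsewhere, e.g. read from disk yourself:
if let url = Bundle.main.url(forResource: "test-image", withExtension: "jpg"),
   let data = try? Data(contentsOf: url) {
    let fromData = UIImage(data: data)
}
```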

My test image was a 1.8 MB JPEG. My test application used almost exactly 100 KB of memory when idle after launch. All images were rendered inside a 240×128 image view.

Loading the image by name resulted in total memory usage of 175 KB. Rendering the image used 456 KB.

Loading the image as Data resulted in memory usage of 1.94 MB. Rendering the image used 2.08 MB.

From this two things are pretty clear:

  1. Loading an image from a bundle does not load all of it into memory.
  2. Rendering an image does increase memory usage when compared to simply storing it in memory.

iOS seems to use as little memory as it can get away with when rendering an image. I was expecting a huge increase in memory usage when rendering, akin to the image being decompressed.

This did not explain the memory spikes causing the crash, so I kept digging.


PHImageManager

PHImageManager provides methods for retrieving images and videos from a device’s gallery. Mattt Thompson has a good article explaining why it’s awesome.

For this test I was using an image from the camera roll. I don’t know the size of the image file on disk, so I can only compare relative memory use before and after the image was loaded.

Much like creating a UIImage, there are two main ways you might use PHImageManager to get a UIImage: have it return a UIImage object, or the Data to create one.
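In code, the two paths look roughly like this (the asset fetch here is a placeholder; a real app would use whichever PHAsset the user picked, and would handle errors):

```swift
import Photos

// Placeholder: grab some image asset from the photo library.
let asset = PHAsset.fetchAssets(with: .image, options: nil).firstObject!

let manager = PHImageManager.default()

// 1. Ask for a ready-made UIImage:
manager.requestImage(for: asset,
                     targetSize: PHImageManagerMaximumSize,
                     contentMode: .aspectFit,
                     options: nil) { image, _ in
    // image is a UIImage?, already decoded by the framework
}

// 2. Ask for the underlying Data and build the UIImage yourself:
manager.requestImageData(for: asset, options: nil) { data, _, _, _ in
    let image = data.flatMap(UIImage.init(data:))
}
```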

Loading a UIImage directly increased memory usage by 35 MB! To check that this increase was caused solely by loading the image, I checked the size of the UIImage object by multiplying the image’s bytesPerRow by its height. The UIImage object was 35 MB.
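That size check can be sketched as follows (assuming the UIImage is backed by a CGImage, which is the common case):

```swift
import UIKit

// Rough in-memory footprint of a decoded image:
// bytes per row × number of rows.
func decodedByteCount(of image: UIImage) -> Int {
    guard let cgImage = image.cgImage else { return 0 }
    return cgImage.bytesPerRow * cgImage.height
}
```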

Memory use before loading image: 100 KB. After loading image: 35 MB

Loading the image Data increased memory usage to around 8 MB. In both cases rendering the image increased memory usage by a little over 100 KB, consistent with my earlier tests.

Memory use before loading image: 100 KB. After loading image from data: 7.96 MB

I had found my memory spike: loading a UIImage directly from PHImageManager.

At first I thought PHImageManager was reading the image in a raw format. However, the metadata returned with the image indicates it is a JPEG. My guess is that PHImageManager decompresses the image before returning it. A panoramic image can be over 100 MB, while the Data for the same image is only about 6 MB. That may sound unbelievable, but the maths checks out: a typical panoramic image on an iPhone is roughly 8000×3000 pixels (I’m rounding down), and at 4 bytes per pixel that gives 8000 × 3000 × 4 = 96,000,000 bytes when completely uncompressed. The same image, exported from Photos, is a 6 MB JPEG.
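The back-of-the-envelope maths, written out:

```swift
// Uncompressed size of a typical iPhone panorama (dimensions rounded down).
let width = 8_000          // pixels
let height = 3_000         // pixels
let bytesPerPixel = 4      // RGBA, 8 bits per channel
let uncompressed = width * height * bytesPerPixel
// uncompressed == 96_000_000 bytes, roughly 96 MB
```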

It’s easy to see how this can cause the OS to intervene and crash your app.


UIImagePickerController

One last thing to check was UIImagePickerController. This is the quickest way to let your users add images from their camera roll into your app. Thankfully, UIImagePickerController does not seem to decompress the image before returning it.

Fixing the crash

To fix the crash we need to avoid loading the full-size image. My solution is to fetch a thumbnail of the image for rendering in the UI, and the Data only for sending to the server.

The size of the thumbnail will depend on your app; I used 500×500 pixels for this test. Loading the thumbnail increased memory usage to 4.5 MB. Subsequently loading the data increased memory usage to 7.57 MB. I’m not sure why the total memory usage is about the same as loading the data alone in the earlier tests. I suspect iOS is doing some clever caching. It could make for an interesting article 😉

Memory allocation showing thumbnail and data fetch
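The fix can be sketched roughly like this (the asset fetch, the 500×500 target size, and `imageView` are placeholders; a real app would use the user’s chosen PHAsset and its own views):

```swift
import Photos
import UIKit

// Placeholder: some image asset and a view to show it in.
let asset = PHAsset.fetchAssets(with: .image, options: nil).firstObject!
let imageView = UIImageView()

let manager = PHImageManager.default()

// Fetch a small thumbnail for display instead of the full-size image.
manager.requestImage(for: asset,
                     targetSize: CGSize(width: 500, height: 500),
                     contentMode: .aspectFit,
                     options: nil) { thumbnail, _ in
    imageView.image = thumbnail
}

// Fetch the Data separately, only when it's time to upload.
manager.requestImageData(for: asset, options: nil) { data, _, _, _ in
    // send data to the server
}
```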

I hope you found this helpful.