[SwiftUI] AsyncImage that can perform downsampling

Fumiya Tanaka
4 min read · Feb 3, 2023

AsyncImage is a SwiftUI component that shows an Image fetched from a network resource. It has been available since iOS 15 and is used like the following.

AsyncImage(url: url) { phase in
    switch phase {
    case .empty:
        // Still loading.
        ProgressView().progressViewStyle(.circular)
    case .failure:
        Text("Error")
    case .success(let image):
        image
            .resizable()
            .frame(width: 56, height: 56)
    @unknown default:
        fatalError()
    }
}

Such a simple interface makes it easy to build a UI layout. But when it comes to using AsyncImage at a production level, I don't think we can use it directly, because AsyncImage has a lot of limitations.

One of the downsides of AsyncImage is that we can't control the process that generates the Image, because the success value of AsyncImage is an Image, not Data or a CGImage. I guess the purpose is to hide that process and keep the API easy to use, but the same design prevents us from customizing it.

In this article, I extend what AsyncImage can do in production by developing AsyncDownSamplingImage, which makes it possible to perform downsampling.

AsyncDownSamplingImage is a SwiftUI component with an interface similar to the original AsyncImage that can perform downsampling, reducing the memory buffer needed to store image data fetched from a server.

Why downsampling?

If you are not familiar with downsampling, the short version follows.

Basically, when you display a high-resolution image, a large amount of memory has to be allocated to store it, even if you render it as a small Image.

For example, say you fetch an image whose size is 1024×1024px and show it in an image view whose size is 64×64px. In this case, the buffer stored for the image (at 4 bytes per pixel) is:

1024 × 1024 × 4 [bytes] ≈ 4.2 [MB]

The buffer size depends on the image, not on the view it is shown in, so there is no way to reduce it when we use AsyncImage.

Some might think 4.2 MB is not so big, but as the number of images grows, the memory usage becomes a real performance problem (in theory, 100 such images take 420 MB).

How can we avoid this? Let's perform downsampling!
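
For context, downsampling on Apple platforms is commonly done with ImageIO's thumbnail API. Below is a minimal sketch of that technique, not the actual implementation of AsyncDownSamplingImage; it assumes the image data has already been fetched, and downsample, pointSize, and scale are illustrative names.

import ImageIO
import UIKit

// Minimal ImageIO-based downsampling sketch (assumed technique, not the library's code).
func downsample(data: Data, to pointSize: CGSize, scale: CGFloat) -> UIImage? {
    // Create the image source without decoding the full-size bitmap.
    let sourceOptions = [kCGImageSourceShouldCache: false] as CFDictionary
    guard let source = CGImageSourceCreateWithData(data as CFData, sourceOptions) else {
        return nil
    }
    let maxDimensionInPixels = max(pointSize.width, pointSize.height) * scale
    let thumbnailOptions = [
        kCGImageSourceCreateThumbnailFromImageAlways: true,
        kCGImageSourceShouldCacheImmediately: true,
        kCGImageSourceCreateThumbnailWithTransform: true,
        kCGImageSourceThumbnailMaxPixelSize: maxDimensionInPixels
    ] as CFDictionary
    // The decoded buffer is sized for the target, not the original image.
    guard let cgImage = CGImageSourceCreateThumbnailAtIndex(source, 0, thumbnailOptions) else {
        return nil
    }
    return UIImage(cgImage: cgImage)
}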

How to use AsyncDownSamplingImage?

It is almost the same as the original SwiftUI AsyncImage. Here is an example.

@State private var url = URL(string: "https://via.placeholder.com/1000")
@State private var size: CGSize = .init(width: 160, height: 160)

...

AsyncDownSamplingImage(
    url: url,
    downsampleSize: size
) { image in
    image.resizable()
        .frame(width: size.width, height: size.height)
} fail: { error in
    Text("Error: \(error.localizedDescription)")
}

For more detail, please check my repository on GitHub (and if you like it, giving it a star would make me happy :).

Actual performance comparison to AsyncImage

As a comparison with AsyncImage, I prepared an example that shows 1000×1000px images in 160×160 SwiftUI Images inside a LazyVGrid.
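
To make the setup concrete, here is a minimal sketch of such a grid; imageURLs is a hypothetical array of 1000×1000px image URLs, and swapping the cell for a plain AsyncImage(url:) reproduces the baseline.

import SwiftUI

struct GridDemo: View {
    // Hypothetical input: URLs pointing at 1000×1000px images.
    let imageURLs: [URL]
    private let size = CGSize(width: 160, height: 160)

    var body: some View {
        ScrollView {
            LazyVGrid(columns: [GridItem(.adaptive(minimum: size.width))]) {
                ForEach(imageURLs, id: \.self) { url in
                    AsyncDownSamplingImage(
                        url: url,
                        downsampleSize: size
                    ) { image in
                        image.resizable()
                            .frame(width: size.width, height: size.height)
                    } fail: { error in
                        Text("Error: \(error.localizedDescription)")
                    }
                }
            }
        }
    }
}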

Let me attach the result below.

(Figure: memory use with the default AsyncImage)
(Figure: memory use with AsyncDownSamplingImage)

As you can see, AsyncDownSamplingImage reduces memory use by roughly 2–3×!

Moreover, the benefit grows as the number of images increases.

Below is a comparison after scrolling through 1000 high-resolution (1000×1000px) images. With AsyncDownSamplingImage, each image is downsampled from 1000×1000px to 160×160px, the same size as the rendered Image.
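
Applying the same back-of-the-envelope arithmetic as before (4 bytes per pixel):

1000 × 1000 × 4 [bytes] ≈ 4.0 [MB] → 160 × 160 × 4 [bytes] ≈ 0.1 [MB]

So, in theory, 1000 images need about 4 GB of image buffers without downsampling, but only about 100 MB with it.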

(Figure: memory use for 1000 default AsyncImages)
(Figure: memory use for 1000 AsyncDownSamplingImages)

Conclusion

The default AsyncImage is a very simple way to fetch an Image from a server, but that simplicity is a trade-off against customization. One of the downsides is that it can't perform downsampling.

I think this is just one of the downsides, and there are probably other things about AsyncImage that could be improved. I hope Apple will evolve the API so that developers can customize it.

Reference / Repository
