The Relationship Between Image Quality And Image Resolution
When talking about image quality, resolution often comes up in the conversation. Resolution, of course, refers to the size of an image measured in pixels (picture elements). Multiplying an image’s width by its height in pixels gives the total number of pixels in the image.
Quality refers to how well an image represents the details stored in its pixels: the color, shadows, contrast, and so on. I have heard some people claim that a higher resolution improves the quality of an image. That would mean that if you have a shot taken at low resolution, increasing its resolution would also increase its quality. Does that really improve the image’s quality?
I have a photo shot with a camera at a resolution of 1280 x 960 pixels. The original image is displayed along with a zoomed-in view at 646%.
When zoomed in, the details are still noticeable enough to make out the scene. Although the image becomes more pixelated and blurry as you zoom, the quality still looks acceptable, though not great.
Will the quality in the image’s details look much better if we were to increase its resolution?
Now, the original image will be upscaled to a resolution of 3800 x 2850 pixels, using a bicubic interpolation algorithm at 300 PPI. This image will also be displayed with a zoomed-in view at 646%.
The quality does not show any improvement after increasing the resolution. While the image looks fine at its full size, the quality actually suffers once you start zooming in. It looks muddier and blurrier, the colors look more faded and the scene looks less coherent. An image captured at a lower resolution will not gain quality when scaled up to a higher resolution.
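For anyone who wants to reproduce this kind of upscale, here is a minimal sketch using the Pillow library in Python. The file names are hypothetical:

```python
from PIL import Image

# Hypothetical file name; the source photo is 1280 x 960.
img = Image.open("photo_1280x960.jpg")

# Bicubic interpolation computes each new pixel as a weighted average
# of a 4 x 4 neighborhood of existing pixels in the original image.
upscaled = img.resize((3800, 2850), Image.Resampling.BICUBIC)

# Saving with 300 PPI metadata tags the intended print density,
# but it adds no detail to the pixels themselves.
upscaled.save("photo_3800x2850.tif", dpi=(300, 300))
```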
So, image resolution does not determine the image quality?
Resolution and quality depend on certain things. To understand this better: digital images captured by electronic sensors (in DSLR and mirrorless cameras) use what is called a raster format. A raster format creates images using pixels (in digital imaging) or dots (when printing digital images). Raster files are created and stored on disk, where they can be retouched by photo editing software. Raster files (e.g. RAW) can later be compressed to decrease file size (e.g. JPEG), but with a tradeoff in detail.
Once an image is captured in raster format, all of its details are stored in the pixels of the image. Thus, you really cannot add new details to improve the quality of an image by upscaling it to a higher resolution. What actually happens is that the upscaler spreads the existing information already stored in the pixels of the original image, by duplicating or interpolating between adjacent pixels. For example, if a pixel captured in the original image has an RGB value of “39,48,43”, that value (or a blend of it with its neighbors) is all the upscaled image has to work with. No new information is captured at all.
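To make that concrete, here is a tiny sketch in Python using NumPy, with the nearest-neighbor case because it is the simplest to see. The pixel value is the one from the example above:

```python
import numpy as np

# One pixel of the original image, with the RGB value from the example.
original = np.array([[[39, 48, 43]]], dtype=np.uint8)  # shape: 1 x 1 x 3

# Nearest-neighbor 2x upscale: every pixel is simply duplicated.
upscaled = original.repeat(2, axis=0).repeat(2, axis=1)

print(upscaled.shape)  # (2, 2, 3)
print(upscaled[1, 1])  # [39 48 43] -- the same value, just copied
```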
This is the reason why professional photographers and seasoned imaging specialists prefer to work with higher resolution images: they have more pixels that store more information. More information means more detail, and therefore much better image quality than a lower resolution image. This shows in their work, and for commercial work it is important to have the highest quality image.
If a photographer shoots an image at 8 MP instead of 32 MP, it will not look as great in print, but it may not make a difference on the web. This is because most web content, including images, is not displayed at full resolution, so the difference is not going to be noticeable. In print, though, the difference in quality really shows. This is why publishers give photographers specific criteria for image resolution and quality.
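To put rough numbers on the print difference: at a typical publishing density of 300 PPI, the maximum print size follows directly from the pixel dimensions. The 4:3 sensor resolutions below are assumptions chosen to approximate 8 MP and 32 MP:

```python
# Maximum print size at 300 PPI for assumed 4:3 sensor resolutions.
sensors = {"8 MP": (3264, 2448), "32 MP": (6528, 4896)}

for label, (w, h) in sensors.items():
    print(f"{label}: {w / 300:.1f} x {h / 300:.1f} inches at 300 PPI")

# 8 MP:  10.9 x 8.2 inches  -- fine for a magazine page
# 32 MP: 21.8 x 16.3 inches -- large enough for a poster
```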
Measure Of Image Quality
Resolution is determined by the ratio of pixels to the physical size of an image. This is measured in PPI (Pixels Per Inch). A high resolution image packs more pixels into every square inch. One common way to compute it is to divide the number of pixels along the diagonal by the length, in inches, of the diagonal line that runs through the image. The higher the PPI, the higher the image resolution, which also means higher image quality.
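As a sketch of that calculation in Python, here is the PPI of the 1280 x 960 photo from earlier, assuming a hypothetical 8-inch diagonal:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count over diagonal length in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# The 1280 x 960 photo from earlier, on a hypothetical 8-inch diagonal.
print(ppi(1280, 960, 8.0))  # 200.0
```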
Dot pitch is a measure used to determine the sharpness of an image. It is measured in millimeters (mm), and a smaller number means a sharper image: the closer the pixels are spaced, the sharper the image looks. The dot pitch is the distance from the center of one pixel to the center of the next. A lower dot pitch means better image quality at a given resolution. For example, a 1024 x 768 display might have a dot pitch of 0.297 mm, while a 3840 x 2400 display might have a dot pitch of 0.125 mm. The latter would be much sharper than the former, and thus have better image quality.
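Dot pitch follows from the panel’s physical size, so it can be checked the same way. The diagonal sizes below are assumptions chosen to reproduce the figures above (roughly a 15-inch and a 22.2-inch panel):

```python
import math

MM_PER_INCH = 25.4

def dot_pitch_mm(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Distance between pixel centers in mm, given the panel diagonal."""
    diagonal_px = math.hypot(width_px, height_px)
    return diagonal_in * MM_PER_INCH / diagonal_px

print(round(dot_pitch_mm(1024, 768, 15.0), 3))   # ~0.298 mm
print(round(dot_pitch_mm(3840, 2400, 22.2), 3))  # ~0.125 mm
```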
Compression is another factor that affects quality. An image in its original raster format is often called the RAW file. The RAW file contains the highest quality version of an image, so some photographers use what is called a lossless format to preserve that quality. An example is the TIFF file format, which also takes up the most storage space on disk.
The JPEG format, based on the DCT (Discrete Cosine Transform), applies lossy compression to the image in order to reduce its file size. However, by compressing the image, quality is lost: the more compression applied, the less quality preserved. JPEG became popular for web content in the early days of the Internet because the smaller file sizes allowed websites to load faster when bandwidth was limited.
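Here is a small sketch of that size-versus-quality tradeoff using Pillow’s JPEG encoder. The input file name is hypothetical:

```python
import io
from PIL import Image

# Hypothetical lossless original.
img = Image.open("photo_1280x960.tif").convert("RGB")

# Re-encode at several JPEG quality levels and compare file sizes.
for quality in (95, 75, 50, 25):
    buf = io.BytesIO()
    img.save(buf, format="JPEG", quality=quality)
    print(f"quality={quality}: {buf.tell()} bytes")

# Lower quality settings quantize the DCT coefficients more aggressively,
# shrinking the file but discarding detail that cannot be recovered.
```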
It’s The Device That Determines Quality
This is probably what most people are aware of. A Canon 5D Mark IV is definitely going to capture better images by default than a typical smartphone camera. The camera’s specifications determine the quality: the lens, sensor size, image resolution, firmware features, image stabilization and image signal processor.
If anything, sensor size matters most, because the sensor is responsible for gathering the light that creates the image. Smartphone cameras can produce high resolution images, but the quality will not match an equivalent DSLR because the smartphone’s sensor is smaller.
The Light Must Be Right
Lighting is something that may not come to mind when thinking about image quality, but it is just as important as the camera. You need light to create images. The best images shot by great photographers like Annie Leibovitz, Ansel Adams and Herb Ritts have one thing in common: good lighting. No matter how high-end a camera is, with poor lighting you won’t get high quality images.
Poorly lit images are horrible to edit because certain details cannot be recovered from shadows and grainy areas. Lack of light also produces blurry images that are neither sharp nor detailed. Colors suffer in poor light as well, decreasing the overall quality of the image. Intentionally shooting in poor light may be interpreted as artistic, but that is a creative choice, not best practice.
It Also Depends On The Display
A brilliant display that can faithfully reproduce an image is also essential for viewing it at its highest quality. A 4K display compared to a standard VGA display is a night and day comparison. When you view your 32 MP image on a VGA display, you won’t get good quality.
This shows that even high resolution images can look poor if your display is poor. The reverse is also true: a 1 MP image won’t look any better on a 4K display. This is why editors in post-production studios require the best displays, with resolutions of 5K and higher, in order to produce the highest quality content.
The Answer
This can be confusing at first, but let’s break down what has been discussed so far.
- Image resolution and image quality are directly related at the time the image is captured (no post-processing involved). For example, if you shoot at a high resolution, you get a high quality image.
- Image resolution and image quality are not related when you are editing the image. For example, when you upscale a low resolution image, you will not improve its quality.
- The type of media used to show the image is very important. The highest quality is best viewed on a high resolution display (e.g. computer monitor, TV, movie screen). Printed images look best at the highest resolution available. Low resolution images displayed on a high resolution display will not look much better. Low resolution images will appear fine on the web, but not in print.
- Compressing image files leads to a loss of quality.
- Overall image quality is determined by the camera or image capture device.
- Good lighting, higher quality. Poor lighting, lower quality.
Now there are new algorithms being developed that can upscale an image and at the same time add “new” details to improve image quality. Using machine learning, researchers are testing super-resolution algorithms built on deep convolutional neural networks. These may soon become the norm, in which case increasing image resolution would also improve image quality. For conventional imaging (no AI involved), image quality is still very much determined at the moment of capture and depends on the camera’s specifications (e.g. image resolution, sensor size, etc.).
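For the curious, here is a minimal sketch of one such approach using OpenCV’s dnn_superres module. It assumes the opencv-contrib-python package is installed and that a pretrained EDSR model file has been downloaded separately; the file names are placeholders:

```python
import cv2

# Create the super-resolution engine and load a pretrained
# EDSR network (a deep convolutional neural network).
sr = cv2.dnn_superres.DnnSuperResImpl_create()
sr.readModel("EDSR_x4.pb")  # placeholder path to the downloaded model
sr.setModel("edsr", 4)      # algorithm name and 4x upscale factor

img = cv2.imread("photo_1280x960.jpg")  # placeholder input
result = sr.upsample(img)   # learned upscale that synthesizes detail

cv2.imwrite("photo_sr_x4.png", result)
```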