MAYA · Published in The Zerone · 7 min read · Jan 5, 2023

Updating the paradigm of photography: The James Webb Telescope

Every so often, the future feels like an unknown territory that we have no choice but to enter. In that sense, our memories and experiences from the past seem to be the only things we truly own. Yet time has a way of erasing our experiences and corrupting our memories the further we move into the future. Naturally, we strive for some proof of the present as it happens, something we can hold on to much longer. Thus, photography was invented. In this regard, a photograph can be seen as a keepsake of the past. Our first attempts at something like photography date back to early human civilization: cave paintings, though they don't quite fall under the same category as photography, served the same classical purpose as a photograph. Before we delve into more abstract concepts, let's return to our main topic: the James Webb telescope. In the following paragraphs, we'll take a brief look at the history of photography and the processes used to capture images, both analog and digital, and then move on to discussing the James Webb telescope in detail.

The first photograph captured with a camera and preserved to this day was taken in 1826 by Joseph Nicéphore Niépce using a process called heliography, which involved exposing a light-sensitive material to light for several hours. Through subsequent refinements, this evolved into a full-fledged method of capturing photographs using chemical development. The method produces a negative image on film, one in which the tones and colors are inverted; the negative can then be used to make a positive print that accurately represents the colors of the original scene. While chemical processes revolutionized photography, they had their limitations, such as the inability to easily alter a photograph once it was captured on film and the cost and risk of handling chemicals. To address these limitations, digital sensors, such as charge-coupled devices (CCDs) and complementary metal-oxide-semiconductor (CMOS) sensors, were developed to capture images in a more convenient and cost-effective way. These sensors work by converting light into an electrical charge, which is then transferred and read out as an image. The evolution of digital cameras greatly impacted the field of photography.
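
To make the sensor description concrete, here is a minimal sketch (in Python, with invented parameter values, not those of any real camera) of how a digital sensor turns incoming light into an image: photons striking each pixel free electrons, the accumulated charge saturates at the pixel's capacity, and readout adds a little noise before the charge is digitized.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative sensor parameters (invented for this sketch)
quantum_efficiency = 0.6      # fraction of photons converted to electrons
full_well = 50_000            # max electrons a pixel can hold before saturating
read_noise_e = 5.0            # electrons of noise added during readout
gain = full_well / 4095       # electrons per digital count (12-bit ADC)

def read_out(photon_counts: np.ndarray) -> np.ndarray:
    """Convert a 2-D map of incident photons into digital pixel values."""
    electrons = rng.poisson(photon_counts * quantum_efficiency)  # shot noise
    electrons = np.minimum(electrons, full_well)                 # saturation
    electrons = electrons + rng.normal(0, read_noise_e, photon_counts.shape)
    return np.clip(electrons / gain, 0, 4095).astype(np.uint16)  # digitize

# A toy "scene": a bright spot on a dim background
scene = np.full((64, 64), 200.0)
scene[28:36, 28:36] = 30_000.0
image = read_out(scene)
print(image.min(), image.max())
```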

The James Webb Space Telescope. Credit: NASA/Chris Gunn

The evolution of photography, from heliography to digital sensors, has been marked by a constant quest for improvement and innovation. The James Webb telescope embodies this spirit of progress, with its cutting-edge scientific instruments and its ambitious mission to study and image cosmic objects and phenomena. After nearly three decades of development, the telescope was finally launched by NASA and successfully deployed into its target orbit. Unlike the Hubble telescope (of which the James Webb is often considered the successor), which orbits just above Earth's atmosphere, the James Webb orbits the Sun approximately 1.5 million kilometers from Earth at the second Lagrange point (L2), where it can remain in a stable position relative to the Sun and Earth. Its primary mirror is over 6 meters in diameter, making it the largest space telescope ever built. It is equipped with a suite of advanced instruments that allow it to capture images with unprecedented clarity and resolution.
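
That 1.5-million-kilometer figure can be sanity-checked with the standard textbook approximation for the Sun-Earth L2 distance, r ≈ R·(m/3M)^(1/3), where R is the Sun-Earth distance, m the Earth's mass, and M the Sun's mass:

```python
# Back-of-the-envelope check of the Sun-Earth L2 distance.
# Approximation: r ~ R * (m_earth / (3 * m_sun)) ** (1/3)
AU_KM = 1.496e8        # Sun-Earth distance in km
M_SUN = 1.989e30       # kg
M_EARTH = 5.972e24     # kg

r_l2 = AU_KM * (M_EARTH / (3 * M_SUN)) ** (1 / 3)
print(f"L2 lies roughly {r_l2:,.0f} km beyond Earth")  # ~1.5 million km
```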

One side of the telescope always faces the Sun, while the other remains in darkness. The sunward side carries a sunshield consisting of five layers of metal-coated plastic, designed to be extremely effective at preventing light and heat from reaching the other side of the telescope, where the imaging equipment is located. Even so, some of the instruments, such as the Mid-Infrared Instrument (MIRI), need to be kept at an astonishing temperature of just 7 kelvin and require a separate cryogenic cooler to maintain it. This low temperature is necessary because, as its name suggests, MIRI detects infrared light, and warm objects, including the instrument itself, emit infrared light; if the instrument gets too warm, its own glow interferes with its observations. Unlike the Hubble telescope, which detects mainly visible and ultraviolet wavelengths, the James Webb is specifically designed to capture the infrared region of the spectrum. This lets it look billions of light-years into deep space, where the light from distant objects has been redshifted into the infrared by the expansion of the universe. Additionally, infrared light, with its longer wavelengths, can penetrate dust and gas clouds, enabling the telescope to observe things hidden from view in visible light, such as the process of star and planet formation, and much more.
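
Both design choices reduce to one-line formulas: cosmological redshift stretches wavelengths as λ_observed = λ_emitted × (1 + z), and Wien's displacement law, λ_peak = b/T, gives the wavelength at which a warm body glows brightest. A quick illustration (the redshift value is just an example):

```python
# Why infrared? Ultraviolet light emitted by a very distant galaxy is
# stretched by cosmic expansion: lambda_obs = lambda_emit * (1 + z).
LYMAN_ALPHA_NM = 121.6          # a strong ultraviolet hydrogen line
z = 10                          # example redshift of an early galaxy
observed_nm = LYMAN_ALPHA_NM * (1 + z)
print(f"Observed at {observed_nm / 1000:.2f} micrometers")  # ~1.34 um, near-infrared

# Why so cold? Wien's law gives the peak of a body's thermal emission.
WIEN_B = 2898.0                 # micrometer-kelvin
for temp_k in (300, 40, 7):
    print(f"{temp_k:>4} K body glows most strongly near {WIEN_B / temp_k:,.0f} um")
# At room temperature the telescope itself would glow right inside
# MIRI's 5-28 um band; at 7 K its emission peak moves far beyond it.
```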

The incoming light from deep space first reflects off the primary mirror, which consists of 18 hexagonal segments with a reflective gold coating chosen for its high infrared reflectivity. The primary mirror precisely aligns the incoming light and focuses it onto the secondary mirror located just above it. The secondary mirror then reflects the light back down through the primary mirror's central aperture and into the telescope's scientific instruments for analysis. There are four primary instruments: the Mid-Infrared Instrument (MIRI), the Near-Infrared Camera (NIRCam), the Near-Infrared Spectrograph (NIRSpec), and the Near-Infrared Imager and Slitless Spectrograph/Fine Guidance Sensor (NIRISS). Each instrument has its own combination of components, sampling techniques, and wavelength range, and together they allow the telescope to make sensitive measurements and observations. As mentioned earlier, the telescope captures images in the infrared region of the electromagnetic spectrum: NIRCam covers the near-infrared region and MIRI the mid-infrared region. Their detectors are similar to the CCD sensors mentioned before, but designed to be sensitive to infrared light. While imaging in two infrared regions may seem redundant, each serves a specific purpose: the near-infrared lets us see through dust and gas clouds, while the mid-infrared picks up the infrared light emitted by the dust and gas clouds themselves, making them visible. For comparison, the left image below was taken using NIRCam and the right image using MIRI.

SOUTHERN RING NEBULA. Source: https://webbtelescope.org
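
For readers who want to go beyond the published pictures, calibrated JWST data files are publicly available from the MAST archive in FITS format and can be inspected with the astropy library. A minimal sketch, with a placeholder filename; the "SCI" extension name follows the JWST pipeline convention for calibrated products:

```python
from astropy.io import fits

# Placeholder filename: any calibrated JWST image product downloaded from MAST
with fits.open("jw02731_nircam_i2d.fits") as hdul:
    hdul.info()                     # list the extensions in the file
    primary = hdul[0].header        # observation metadata
    data = hdul["SCI"].data         # the calibrated 2-D image array

    print("Instrument:", primary.get("INSTRUME", "unknown"))
    print("Filter:", primary.get("FILTER", "unknown"))
    print("Image shape:", data.shape)
```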

The telescope is also equipped with a spectrograph, the Near-Infrared Spectrograph (NIRSpec), which works by separating the light from a celestial object into its individual wavelengths. The light is focused onto a detector, which measures its intensity at each wavelength. This data is used to create a spectrum, from which astrophysicists can learn about the composition, temperature, and motion of celestial objects. It even has the potential to detect biomarkers in the atmospheres of exoplanets, substances mainly produced by living organisms (such as oxygen or phosphine).
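
As a toy illustration of what a spectrograph's output looks like, the sketch below builds a fake spectrum containing a single emission line, locates the line, and infers a redshift from how far it has moved from its rest wavelength. All values are invented for illustration:

```python
import numpy as np

# Toy spectrum: flat continuum plus one Gaussian emission line, with noise
wavelengths = np.linspace(1.0, 5.0, 2000)        # micrometers
rest_line = 1.875                                # hydrogen Paschen-alpha, um
observed_line = 2.10                             # where the line actually appears
flux = 1.0 + 4.0 * np.exp(-((wavelengths - observed_line) / 0.01) ** 2)
flux += np.random.default_rng(0).normal(0, 0.05, wavelengths.size)

# Locate the line peak and infer the redshift z from its displacement
peak_wavelength = wavelengths[np.argmax(flux)]
z = peak_wavelength / rest_line - 1
print(f"Line found at {peak_wavelength:.3f} um -> redshift z = {z:.3f}")
```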

The telescope communicates with ground stations, where its data is further processed and analyzed. It may seem puzzling that, although the telescope captures images in the infrared spectrum, the resulting images are vibrant and colorful, like the one shown below. How is this possible? The telescope uses dozens of filters, each admitting a different band of infrared light. These bands are then assigned colors from the visible range of the spectrum, with the shortest wavelengths mapped to blue and the longest to red. The separate exposures are combined and, after some post-processing, the image is ready for viewing. It is important to note that these images still represent the real data; the wavelengths have simply been translated into the visible range so that we can perceive the depth and structure of the image.

CARINA NEBULA, 7,500 light-years away. Credit: the James Webb Space Telescope
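
Here is a minimal sketch of that chromatic mapping, using random placeholder arrays in place of three real narrowband exposures: each exposure is normalized and assigned to the blue, green, or red channel in order of increasing wavelength.

```python
import numpy as np

def stretch(img: np.ndarray) -> np.ndarray:
    """Normalize an exposure to 0-1 (real pipelines use fancier stretches)."""
    lo, hi = np.percentile(img, (1, 99))
    return np.clip((img - lo) / (hi - lo), 0, 1)

# Placeholder exposures standing in for three narrowband infrared filters,
# listed from shortest to longest wavelength
rng = np.random.default_rng(1)
short_wave = rng.gamma(2.0, 1.0, (128, 128))
mid_wave = rng.gamma(2.0, 1.0, (128, 128))
long_wave = rng.gamma(2.0, 1.0, (128, 128))

# Shortest wavelength -> blue channel, longest -> red, preserving order
rgb = np.dstack([stretch(long_wave),    # red
                 stretch(mid_wave),     # green
                 stretch(short_wave)])  # blue
print(rgb.shape)  # (128, 128, 3), ready for display with e.g. matplotlib
```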

It is interesting to observe how, throughout the evolution of photography, the definition of a photograph has remained constant: a "reminder of the past." What has changed is what that definition means. Photographs used to be a way to remember the known past, a past we ourselves had experienced. Now, with the advanced capabilities of modern telescopes like the James Webb, photographs are used to remember and study the unknown past. Its instruments allow us to look back in time, almost like a window onto the early universe. This represents a significant shift in the field of photography.

As a concluding remark, the James Webb is a significant milestone in the field of astronomy. Its ability to observe deep space with unparalleled clarity has the potential to reveal insights into some of the most mysterious and poorly understood phenomena in the universe, such as the formation of early galaxies, the nature of black holes, and the role of dark matter in the early universe. Since its launch in December 2021, it has made several iconic discoveries, including the detection of some of the earliest known galaxies. It is hoped that throughout its lifetime it will continue to expand our understanding of the early universe and fill many gaps in our current knowledge, all while fulfilling the classical purpose of photography: reflecting the past.
