Saturn moon could support life in its subsurface ocean

Paradigm
Published Jan 6, 2021 · 24 min read

Space biweekly vol.17, 23rd December — 6th January

TL;DR

  • Using data from NASA’s Cassini spacecraft, scientists modeled chemical processes in the subsurface ocean of Saturn’s moon Enceladus. The studies indicate the possibility that a varied metabolic menu could support a potentially diverse microbial community in the liquid water ocean beneath the moon’s icy facade.
  • It took fifteen years of imaging and nearly three years of stitching the pieces together to create the largest image ever made, the 8-trillion-pixel mosaic of Mars’ surface. Now, the first study to utilize the image in its entirety provides unprecedented insight into the ancient river systems that once covered the expansive plains in the planet’s southern hemisphere.
  • Although stellar flares are typically viewed as a detriment to habitability, a new study shows that ‘life might still have a fighting chance.’ Researchers find that flares can drive a planet’s atmospheric composition to a new chemical equilibrium.
  • In 1983, theoretical physicist Pierre Sikivie found that axions have another remarkable property: in the presence of an electromagnetic field, they should sometimes spontaneously convert to easily detectable photons. What was once thought to be completely undetectable turned out to be potentially detectable, provided there is a high enough concentration of axions and a strong enough magnetic field.
  • A new study finds that neutron stars are typically about 11.75 kilometers in radius and provides a novel calculation of the Hubble constant.
  • Astrophysicists have for the first time observed a gas filament with a length of 50 million light years. Its structure is strikingly similar to the predictions of computer simulations. The observation therefore also confirms our ideas about the origin and evolution of our universe.
  • Scientists have trained machine learning software to classify supernovae without the traditional use of spectra. The project — the first to use real supernova data to train its artificial intelligence — is 82% accurate. Currently, scientists take spectra of only about 10% of the ~10,000 supernovae discovered each year. When the Rubin Observatory comes online, only 0.1% of its expected supernova discoveries will receive spectroscopic follow-up, making software like this essential.
  • SpaceX wins $150 million contract to launch Space Development Agency satellites.
  • Lockheed Martin gets $4.9 billion contract to build three missile-warning satellites for U.S. Space Force.
  • White House releases planetary protection strategy.
  • China’s CASC targets more than 40 space launches in 2021.
  • Upcoming industry events. And more!

Space industry in numbers

Last summer, the Space Foundation published the second-quarter findings of its 2019 issue of The Space Report, revealing that:

  • The global space economy grew 8.1% in 2018 to USD 414.75 billion, exceeding USD 400 billion for the first time.
  • Global launches in 2018 increased by 46% over the number of launches a decade ago.
  • Global launches in 2018 exceeded 100 for the first time since 1990.

Analysts at Morgan Stanley and Goldman Sachs have predicted that economic activity in space will become a multi-trillion-dollar market in the coming decades. Morgan Stanley’s Space Team estimates that the roughly USD 350 billion global space industry could surge to over USD 1 trillion by 2040.

Source: Satellite Industry Association, Morgan Stanley Research, Thomson Reuters. *2040 estimates

Space industry news

Lockheed Martin gets $4.9 billion contract to build three missile-warning satellites for U.S. Space Force
Space Force’s small launch program looks to pick up pace after a year of delays
China’s CASC targets more than 40 space launches in 2021
Virgin Orbit, Rocket Lab schedule first launches of 2021
Head of U.S. Strategic Command blasts GBSD critics: ‘Minuteman 3 cannot be life-extended’
Delta Air Lines adds Viasat in-flight connectivity service
AAC Clyde subsidiary Hyperion to fly CubeCAT laser terminal
SpinLaunch expands New Mexico test site
Spacety shares first images from small C-band SAR satellite
Pace steps down from National Space Council
India aims for reusable rockets, advanced propulsion in decadal spaceflight plan
Ten companies bid for NASA small launch vehicle contract
Puerto Rico government supports rebuilding Arecibo
Congress overrides Trump’s veto and passes the National Defense Authorization Act
SpaceX wins $150 million contract to launch Space Development Agency satellites
NSTXL’s contract to manage Space Force technology projects on hold pending review
White House releases planetary protection strategy
Soyuz launches French reconnaissance satellite in final 2020 launch
House overrides Trump’s veto of the National Defense Authorization Act
Viasat asks FCC to perform environmental review of Starlink
Brexit deal allows UK to continue participation in Copernicus
Defense bill prospects unclear as House prepares to override Trump’s veto
Dealing with dust: A back-to-the-moon dilemma
Voyager Space Holdings to acquire majority stake in Nanoracks
ESA Clean Space tackles space junk one component at a time
Space Development Agency to reevaluate proposals for missile-tracking satellites
SLS Exploration Upper Stage passes review
Omnibus spending bill gives Space Force its first separate budget
Raytheon completes acquisition of Blue Canyon Technologies

Space exploration

Oxidation processes diversify the metabolic menu on Enceladus

by Christine Ray, Christopher R. Glein, J. Hunter Waite, Ben Teolis, Tori Hoehler, Julie Huber, Jonathan Lunine, Frank Postberg in Icarus

Using data from NASA’s Cassini spacecraft, scientists at Southwest Research Institute (SwRI) modeled chemical processes in the subsurface ocean of Saturn’s moon Enceladus. The studies indicate the possibility that a varied metabolic menu could support a potentially diverse microbial community in the liquid water ocean beneath the moon’s icy facade.

Prior to its deorbit in September of 2017, Cassini sampled the plume of ice grains and water vapor erupting from cracks on the icy surface of Enceladus, discovering molecular hydrogen, a potential food source for microbes. A new paper published in the planetary science journal Icarus explores other potential energy sources.

“The detection of molecular hydrogen (H2) in the plume indicated that there is free energy available in the ocean of Enceladus,” said lead author Christine Ray, who works part time at SwRI as she pursues a Ph.D. in physics from The University of Texas at San Antonio. “On Earth, aerobic, or oxygen-breathing, creatures consume energy in organic matter such as glucose and oxygen to create carbon dioxide and water. Anaerobic microbes can metabolize hydrogen to create methane. All life can be distilled to similar chemical reactions associated with a disequilibrium between oxidant and reductant compounds.”

This disequilibrium creates a potential energy gradient, where redox chemistry transfers electrons between chemical species, most often with one species undergoing oxidation while another species undergoes reduction. These processes are vital to many basic functions of life, including photosynthesis and respiration. For example, hydrogen is a source of chemical energy supporting anaerobic microbes that live in the Earth’s oceans near hydrothermal vents. At Earth’s ocean floor, hydrothermal vents emit hot, energy-rich, mineral-laden fluids that allow unique ecosystems teeming with unusual creatures to thrive. Previous research found growing evidence of hydrothermal vents and chemical disequilibrium on Enceladus, which hints at habitable conditions in its subsurface ocean.

“We wondered if other types of metabolic pathways could also provide sources of energy in Enceladus’ ocean,” Ray said. “Because that would require a different set of oxidants that we have not yet detected in the plume of Enceladus, we performed chemical modeling to determine if the conditions in the ocean and the rocky core could support these chemical processes.”

For example, the authors looked at how ionizing radiation from space could create the oxidants O2 and H2O2, and how abiotic geochemistry in the ocean and rocky core could contribute to chemical disequilibria that might support metabolic processes. The team considered whether these oxidants could accumulate over time if reductants are not present in appreciable amounts. They also considered how aqueous reductants or seafloor minerals could convert these oxidants into sulfates and iron oxides.

“We compared our free energy estimates to ecosystems on Earth and determined that, overall, our values for both aerobic and anaerobic metabolisms meet or exceed minimum requirements,” Ray said. “These results indicate that oxidant production and oxidation chemistry could contribute to supporting possible life and a metabolically diverse microbial community on Enceladus.”

“Now that we’ve identified potential food sources for microbes, the next question to ask is ‘what is the nature of the complex organics that are coming out of the ocean?’” said SwRI Program Director Dr. Hunter Waite, a coauthor of the new paper, referencing an online Nature paper authored by Postberg et al. in 2018. “This new paper is another step in understanding how a small moon can sustain life in ways that completely exceed our expectations!”

The paper’s findings also have great significance for the next generation of exploration.

“A future spacecraft could fly through the plume of Enceladus to test this paper’s predictions on the abundances of oxidized compounds in the ocean,” said SwRI Senior Research Scientist Dr. Christopher Glein, another coauthor. “We must be cautious, but I find it exhilarating to ponder whether there might be strange forms of life that take advantage of these sources of energy that appear to be fundamental to the workings of Enceladus.”

The global distribution of depositional rivers on early Mars

by J.L. Dickson, M.P. Lamb, R.M.E. Williams, A.T. Hayden, W.W. Fischer in Geology

It took fifteen years of imaging and nearly three years of stitching the pieces together to create the largest image ever made, the 8-trillion-pixel mosaic of Mars’ surface. Now, the first study to utilize the image in its entirety provides unprecedented insight into the ancient river systems that once covered the expansive plains in the planet’s southern hemisphere. These three billion-year-old sedimentary rocks, like those in Earth’s geologic record, could prove valuable targets for future exploration of past climates and tectonics on Mars.

The work complements existing research into Mars’ hydrologic history by mapping ancient fluvial (river) ridges, which are essentially the inverse of a riverbed. “If you have a river channel, that’s the erosion part of a river. So, by definition, there aren’t any deposits there for you to study,” Jay Dickson, lead author on the paper, explains. “You have rivers eroding rocks, so where did those rocks go? These ridges are the other half of the puzzle.” Using the mosaic, as opposed to more localized imagery, let the researchers solve that puzzle on a global scale.

Mars used to be a wet world, as evidenced by rock records of lakes, rivers, and glaciers. The river ridges were formed between 4 and 3 billion years ago, when large, flat-lying rivers deposited sediments in their channels (rather than only having the water cut away at the surface). Similar systems today can be found in places like southern Utah and Death Valley in the U.S., and the Atacama Desert in Chile. Over time, sediment built up in the channels; once the water dried up, those ridges were all that was left of some rivers.

The ridges are present only in the southern hemisphere, where some of Mars’ oldest and most rugged terrain is, but this pattern is likely a preservation artifact. “These ridges probably used to be all over the entire planet, but subsequent processes have buried them or eroded them away,” Dickson says. “The northern hemisphere is very smooth because it’s been resurfaced, primarily by lava flows.” Additionally, the southern highlands are “some of the flattest surfaces in the solar system,” says Woodward Fischer, who was involved in this work. That exceptional flatness made for good sedimentary deposition, allowing the creation of the records being studied today.

Whether or not a region has fluvial ridges is a basic observation that wasn’t possible until this high-resolution image of the planet’s surface was assembled. Each of the 8 trillion pixels represents 5 to 6 square meters, and coverage is nearly 100 percent, thanks to the “spectacular engineering” of NASA’s context camera that has allowed it to operate continuously for well over a decade. An earlier attempt to map these ridges was published in 2007 by Rebecca Williams, a co-author on the new study, but that work was limited by imagery coverage and quality.

“The first inventory of fluvial ridges using meter-scale images was conducted on data acquired between 1997 and 2006,” Williams says. “These image strips sampled the planet and provided tantalizing snapshots of the surface, but there was lingering uncertainty about missing fluvial ridges in the data gaps.”

The resolution and coverage of Mars’ surface in the mosaic has eliminated much of the team’s uncertainty, filling in gaps and providing context for the features. The mosaic allows researchers to explore questions at global scales, rather than being limited to patchier, localized studies and extrapolating results to the whole hemisphere. Much previous research on Mars hydrology has been limited to craters or single systems, where both the sediment source and destination are known. That’s useful, but more context is better in order to really understand a planet’s environmental history and to be more certain in how an individual feature formed.

In addition to identifying 18 new fluvial ridges, using the mosaic image allowed the team to re-examine features that had previously been identified as fluvial ridges. Upon closer inspection, some weren’t formed by rivers after all, but rather lava flows or glaciers. “If you only see a small part of [a ridge], you might have an idea of how it formed,” Dickson says. “But then you see it in a larger context — like, oh, it’s the flank of a volcano, it’s a lava flow. So now we can more confidently determine which are fluvial ridges, versus ridges formed by other processes.”

Now that we have a global understanding of the distribution of ancient rivers on Mars, future explorations — whether by rover or by astronauts — could use these rock records to investigate what past climates and tectonics were like. “One of the biggest breakthroughs in the last twenty years is the recognition that Mars has a sedimentary record, which means we’re not limited to studying the planet today,” Fischer says. “We can ask questions about its history.” And in doing so, he says, we learn not only about a single planet’s past, but also find “truths about how planets evolved… and why the Earth is habitable.”

As this study is only the first to use the full mosaic, Dickson looks forward to seeing how it gets put to use next. “We expect to see more and more studies, similar in scale to what we’re doing here, by other researchers around the world,” he says. “We hope that this ‘maiden voyage’ scientific study sets an example for the scale of science that can be done with a product this big.”

Persistence of flare-driven atmospheric chemistry on rocky habitable zone worlds

by Howard Chen, Zhuchang Zhan, Allison Youngblood, Eric T. Wolf, Adina D. Feinstein, Daniel E. Horton in Nature Astronomy

Although violent and unpredictable, stellar flares emitted by a planet’s host star do not necessarily prevent life from forming, according to a new Northwestern University study.

Emitted by stars, stellar flares are sudden flashes of magnetic energy. On Earth, the sun’s flares sometimes damage satellites and disrupt radio communications. Elsewhere in the universe, robust stellar flares also have the ability to deplete and destroy atmospheric gases, such as ozone. Without ozone, harmful levels of ultraviolet (UV) radiation can penetrate a planet’s atmosphere, thereby diminishing its chances of harboring surface life.

By combining 3D atmospheric chemistry and climate modeling with observed flare data from distant stars, a Northwestern-led team discovered that stellar flares could play an important role in the long-term evolution of a planet’s atmosphere and habitability.

“We compared the atmospheric chemistry of planets experiencing frequent flares with planets experiencing no flares. The long-term atmospheric chemistry is very different,” said Northwestern’s Howard Chen, the study’s first author. “Continuous flares actually drive a planet’s atmospheric composition into a new chemical equilibrium.”

“We’ve found that stellar flares might not preclude the existence of life,” added Daniel Horton, the study’s senior author. “In some cases, flaring doesn’t erode all of the atmospheric ozone. Surface life might still have a fighting chance.”

The study is a joint effort among researchers at Northwestern, University of Colorado at Boulder, University of Chicago, Massachusetts Institute of Technology and NASA Nexus for Exoplanet System Science (NExSS).

Horton is an assistant professor of Earth and planetary sciences in Northwestern’s Weinberg College of Arts and Sciences. Chen is a Ph.D. candidate in Horton’s Climate Change Research Group and a NASA future investigator.

Importance of flares

All stars — including our very own sun — flare, or randomly release stored energy. Fortunately for Earthlings, the sun’s flares typically have a minimal impact on the planet.

“Our sun is more of a gentle giant,” said Allison Youngblood, an astronomer at the University of Colorado and co-author of the study. “It’s older and not as active as younger and smaller stars. Earth also has a strong magnetic field, which deflects the sun’s damaging winds.”

Unfortunately, most potentially habitable exoplanets aren’t as lucky. For planets to potentially harbor life, they must be close enough to a star that their water won’t freeze — but not so close that water vaporizes.

“We studied planets orbiting within the habitable zones of M and K dwarf stars — the most common stars in the universe,” Horton said. “Habitable zones around these stars are narrower because the stars are smaller and less powerful than stars like our sun. On the flip side, M and K dwarf stars are thought to have more frequent flaring activity than our sun, and their tidally locked planets are unlikely to have magnetic fields helping deflect their stellar winds.”

Chen and Horton previously conducted a study of the long-term climate averages of M dwarf stellar systems. Flares, however, occur on hours- or days-long timescales. Although these brief timescales can be difficult to simulate, incorporating the effects of flares is important to forming a more complete picture of exoplanet atmospheres. The researchers accomplished this by incorporating flare data from NASA’s Transiting Exoplanet Survey Satellite (TESS), launched in 2018, into their model simulations.

Using flares to detect life

If there is life on these M and K dwarf exoplanets, previous work hypothesizes that stellar flares might make it easier to detect. For example, stellar flares can increase the abundance of life-indicating gasses (such as nitrogen dioxide, nitrous oxide and nitric acid) from imperceptible to detectable levels.

“Space weather events are typically viewed as a detriment to habitability,” Chen said. “But our study quantitatively shows that some space weather can actually help us detect signatures of important gases that might signify biological processes.”

This study involved researchers from a wide range of backgrounds and expertise, including climate scientists, exoplanet scientists, astronomers, theorists and observers.

“This project was a result of fantastic collective team effort,” said Eric T. Wolf, a planetary scientist at CU Boulder and a co-author of the study. “Our work highlights the benefits of interdisciplinary efforts when investigating conditions on extrasolar planets.”

Green Bank and Effelsberg Radio Telescope Searches for Axion Dark Matter Conversion in Neutron Star Magnetospheres

by Joshua W. Foster, Yonatan Kahn, Oscar Macias, Zhiquan Sun, Ralph P. Eatough, Vladislav I. Kondratiev, Wendy M. Peters, Christoph Weniger, Benjamin R. Safdi in Physical Review Letters

In the 1970s, physicists uncovered a problem with the Standard Model of particle physics — the theory that describes three of the four fundamental forces of nature (the electromagnetic, weak, and strong interactions; the fourth is gravity). The theory predicts that a certain symmetry between particles in our Universe and their mirror-image counterparts should be broken in the strong interaction, yet experiments show no sign of it. This mismatch between theory and observation is dubbed the “strong CP problem” — CP stands for charge conjugation and parity. Why has it puzzled scientists for almost half a century?

In the Standard Model, electromagnetism is symmetric under C (charge conjugation), which replaces particles with antiparticles; P (parity), which replaces all particles with their mirror-image counterparts; and T (time reversal), which replaces interactions going forward in time with ones going backward in time; as well as under the combined operations CP, CT, PT, and CPT. This means that experiments sensitive to the electromagnetic interaction should not be able to distinguish an original system from one transformed by any of these symmetry operations.

In the case of the electromagnetic interaction, the theory matches the observations very well. As anticipated, the problem lies in one of the two nuclear forces, the strong interaction. As it turns out, the theory allows violations of the combined symmetry operation CP (reflecting particles in a mirror and then exchanging particles for antiparticles) in both the weak and strong interactions. However, CP violation has so far been observed only in the weak interaction.

More specifically, in the weak interaction, CP violation occurs at approximately the 1-in-1,000 level, and many scientists expected a similar level of violation in the strong interaction. Yet experimentalists have searched for it extensively, to no avail. If CP violation does occur in the strong interaction, it is suppressed by more than a factor of one billion (10⁹).

In 1977, theoretical physicists Roberto Peccei and Helen Quinn proposed a possible solution: they hypothesized a new symmetry that suppresses CP-violating terms in the strong interaction, thus making the theory match the observations. Shortly after, Steven Weinberg and Frank Wilczek — both of whom went on to win the Nobel Prize in physics in 1979 and 2004, respectively — realized that this mechanism creates an entirely new particle. Wilczek ultimately dubbed this new particle the “axion,” after a popular dish detergent with the same name, for its ability to “clean up” the strong CP problem.

The axion should be an extremely light particle, be extraordinarily abundant in number, and have no charge. Due to these characteristics, axions are excellent dark matter candidates. Dark matter makes up about 85 percent of the mass content of the Universe, but its fundamental nature remains one of the biggest mysteries of modern science. Finding that dark matter is made of axions would be one of the greatest discoveries of modern science.

In 1983, theoretical physicist Pierre Sikivie found that axions have another remarkable property: in the presence of an electromagnetic field, they should sometimes spontaneously convert to easily detectable photons. What was once thought to be completely undetectable turned out to be potentially detectable, provided there is a high enough concentration of axions and a strong enough magnetic field.

Some of the Universe’s strongest magnetic fields surround neutron stars. Since these objects are also very massive, they could also attract copious numbers of axion dark matter particles. So physicists have proposed searching for axion signals in the surrounding regions of neutron stars. Now, an international research team, including the Kavli Institute for the Physics and Mathematics of the Universe (Kavli IPMU) postdoc Oscar Macias, has done exactly that with two radio telescopes — the Robert C. Byrd Green Bank Telescope in the US, and the Effelsberg 100-m Radio Telescope in Germany.

The targets of this search were two nearby neutron stars known to have strong magnetic fields, as well as the Milky Way’s center, which is estimated to host half a billion neutron stars. The team sampled radio frequencies in the 1-GHz range, corresponding to axion masses of 5–11 micro-electron-volts (μeV). Since no signal was seen, the team was able to impose the strongest limits to date on axion dark matter particles of a few μeV in mass.

Figure: the 95% upper limits on the signal flux for the indicated sources from the GBT and Effelsberg observations (curves down-sampled for visualization), compared with the 95% upper limits expected from the ideal radiometer equation under the assumption that thermal noise at the total system temperature is the only source of statistical uncertainty.
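The correspondence between sampled radio frequency and axion mass follows from energy conservation: the converted photon carries the axion’s rest energy, so ν = m·c²/h. A quick sanity check of the quoted numbers (this is just the standard unit conversion, not a calculation from the paper):

```python
# Photon frequency from axion rest mass. The converted photon carries
# the axion's rest energy E = m*c^2, and E = h*nu, so nu = m*c^2 / h.
EV_IN_JOULES = 1.602176634e-19   # 1 eV in joules (exact, SI 2019)
PLANCK_H = 6.62607015e-34        # Planck constant, J*s (exact, SI 2019)

def axion_mass_to_ghz(mass_ev: float) -> float:
    """Photon frequency in GHz for an axion of rest mass `mass_ev` (eV)."""
    return mass_ev * EV_IN_JOULES / PLANCK_H / 1e9

# The 5-11 micro-eV mass window maps onto roughly 1.2-2.7 GHz,
# i.e. the ~1 GHz radio band covered by GBT and Effelsberg.
print(axion_mass_to_ghz(5e-6))
print(axion_mass_to_ghz(11e-6))
```
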

Multimessenger constraints on the neutron-star equation of state and the Hubble constant

by Tim Dietrich, Michael W. Coughlin, Peter T. H. Pang, Mattia Bulla, Jack Heinzel, Lina Issa, Ingo Tews, Sarah Antier in Science

A combination of astrophysical measurements has allowed researchers to put new constraints on the radius of a typical neutron star and provide a novel calculation of the Hubble constant that indicates the rate at which the universe is expanding.

“We studied signals that came from various sources, for example recently observed mergers of neutron stars,” said Ingo Tews, a theorist in the Nuclear and Particle Physics, Astrophysics and Cosmology group at Los Alamos National Laboratory, who worked with an international collaboration of researchers on the analysis published in the journal Science. “We jointly analyzed gravitational-wave signals and electromagnetic emissions from the mergers, and combined them with previous mass measurements of pulsars or recent results from NASA’s Neutron Star Interior Composition Explorer. We find that the radius of a typical neutron star is about 11.75 kilometers and the Hubble constant is approximately 66.2 kilometers per second per megaparsec.”

Combining signals to gain insight into distant astrophysical phenomena is known in the field as multi-messenger astronomy. In this case, the researchers’ multi-messenger analysis allowed them to restrict the uncertainty of their estimate of neutron star radii to within 800 meters.

Their novel approach to measuring the Hubble constant contributes to a debate that has arisen from other, competing determinations of the universe’s expansion. Measurements based on observations of exploding stars known as supernovae are currently at odds with those that come from looking at the Cosmic Microwave Background (CMB), which is essentially the leftover energy from the Big Bang. The uncertainties in the new multimessenger Hubble calculation are too large to definitively resolve the disagreement, but the measurement is slightly more supportive of the CMB approach.
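For intuition on what the quoted value implies, the inverse of the Hubble constant sets a characteristic timescale, the Hubble time, which is a rough proxy for the age of the universe. A minimal sketch (the conversion constants are standard values, not figures from the paper, and the precise age depends on the cosmological model):

```python
# The Hubble constant gives recession velocity per unit distance,
# v = H0 * d, so 1/H0 has units of time. Converting km/s/Mpc to Gyr:
KM_PER_MPC = 3.0857e19       # kilometers in one megaparsec
SECONDS_PER_GYR = 3.156e16   # seconds in one gigayear (Julian years)

def hubble_time_gyr(h0_km_s_mpc: float) -> float:
    """Hubble time in Gyr for a Hubble constant given in km/s/Mpc."""
    return KM_PER_MPC / h0_km_s_mpc / SECONDS_PER_GYR

print(hubble_time_gyr(66.2))   # the multimessenger estimate, ~14.8 Gyr
print(hubble_time_gyr(73.0))   # supernova-based values are higher, ~13.4 Gyr
```

A larger H0 (faster expansion today) implies a shorter Hubble time, which is one way to see why the competing measurements matter.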

Tews’ primary scientific role in the study was to provide the input from nuclear theory calculations that are the starting point of the analysis. His seven collaborators on the paper comprise an international team of scientists from Germany, the Netherlands, Sweden, France, and the United States.

SuperRAENN: A semi-supervised supernova photometric classification pipeline trained on Pan-STARRS1 Medium Deep Survey supernovae

by V. Ashley Villar, Griffin Hosseinzadeh, Edo Berger, Michelle Ntampaka, David O. Jones, Peter Challis, Ryan Chornock, Maria R. Drout, et al. in The Astrophysical Journal

&

Photometric Classification of 2315 Pan-STARRS1 Supernovae with Superphot

by Griffin Hosseinzadeh, Frederick Dauphin, V. Ashley Villar, Edo Berger, David O. Jones, Peter Challis, Ryan Chornock, Maria R. Drout, Ryan J. Foley, et al. in The Astrophysical Journal

Artificial intelligence is classifying real supernova explosions without the traditional use of spectra, thanks to a team of astronomers at the Center for Astrophysics | Harvard & Smithsonian. The complete data sets and resulting classifications are publicly available for open use.

By training a machine learning model to categorize supernovae based on their visible characteristics, the astronomers were able to classify real data from the Pan-STARRS1 Medium Deep Survey for 2,315 supernovae with an accuracy rate of 82-percent without the use of spectra.

The astronomers developed a software program that classifies different types of supernovae based on their light curves, or how their brightness changes over time. “We have approximately 2,500 supernovae with light curves from the Pan-STARRS1 Medium Deep Survey, and of those, 500 supernovae with spectra that can be used for classification,” said Griffin Hosseinzadeh, a postdoctoral researcher at the CfA and lead author on the first of two papers. “We trained the classifier using those 500 supernovae to classify the remaining supernovae where we were not able to observe the spectrum.”

Edo Berger, an astronomer at the CfA, explained that by asking the artificial intelligence to answer specific questions, the results become increasingly accurate. “The machine learning looks for a correlation with the original 500 spectroscopic labels. We ask it to compare the supernovae in different categories: color, rate of evolution, or brightness. By feeding it real existing knowledge, it leads to the highest accuracy, between 80- and 90-percent.”

Although this is not the first machine learning project for supernovae classification, it is the first time that astronomers have had access to a real data set large enough to train an artificial intelligence-based supernovae classifier, making it possible to create machine learning algorithms without the use of simulations.

“If you make a simulated light curve, it means you are making an assumption about what supernovae will look like, and your classifier will then learn those assumptions as well,” said Hosseinzadeh. “Nature will always throw some additional complications in that you did not account for, meaning that your classifier will not do as well on real data as it did on simulated data. Because we used real data to train our classifiers, it means our measured accuracy is probably more representative of how our classifiers will perform on other surveys.” As the classifier categorizes the supernovae, said Berger, “We will be able to study them both in retrospect and in real-time to pick out the most interesting events for detailed follow up. We will use the algorithm to help us pick out the needles and also to look at the haystack.”
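The workflow described above, training on the small spectroscopically labeled set and then classifying the much larger photometric sample, can be sketched with a toy example. This is not the papers’ actual method (SuperRAENN uses a semi-supervised recurrent autoencoder); the features, numbers, and nearest-centroid rule below are invented purely to illustrate the train-on-labeled, apply-to-unlabeled idea:

```python
# Toy photometric classification: represent each supernova by simple
# light-curve features (peak magnitude, decline rate), compute the mean
# feature vector of each class from the labeled set, then assign every
# unlabeled event to the nearest class centroid. All values are invented.
from math import dist

def centroids(labeled):
    """Mean feature vector per class, from (features, label) pairs."""
    sums, counts = {}, {}
    for feats, label in labeled:
        acc = sums.setdefault(label, [0.0] * len(feats))
        for i, f in enumerate(feats):
            acc[i] += f
        counts[label] = counts.get(label, 0) + 1
    return {lbl: [s / counts[lbl] for s in acc] for lbl, acc in sums.items()}

def classify(feats, cents):
    """Label of the class centroid closest in Euclidean distance."""
    return min(cents, key=lambda lbl: dist(feats, cents[lbl]))

# (peak magnitude, decline rate in mag/day): invented training set,
# standing in for the ~500 spectroscopically labeled supernovae.
labeled = [
    ((-19.3, 0.10), "Ia"), ((-19.1, 0.11), "Ia"),
    ((-17.0, 0.05), "II"), ((-16.8, 0.04), "II"),
]
cents = centroids(labeled)
print(classify((-19.2, 0.10), cents))  # -> Ia
print(classify((-16.9, 0.05), cents))  # -> II
```

A real pipeline replaces the hand-made features with representations learned from the full light curves, which is exactly where training on real rather than simulated data pays off.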

The project has implications not only for archival data, but also for data that will be collected by future telescopes. The Vera C. Rubin Observatory is expected to go online in 2023, and will lead to the discovery of millions of new supernovae each year. This presents both opportunities and challenges for astrophysicists, where limited telescope time leads to limited spectral classifications.

“When the Rubin Observatory goes online it will increase our discovery rate of supernovae by 100-fold, but our spectroscopic resources will not increase,” said Ashley Villar, a Simons Junior Fellow at Columbia University and lead author on the second of the two papers, adding that while roughly 10,000 supernovae are currently discovered each year, scientists only take spectra of about 10-percent of those objects. “If this holds true, it means that only 0.1-percent of supernovae discovered by the Rubin Observatory each year will get a spectroscopic label. The remaining 99.9-percent of data will be unusable without methods like ours.”

Unlike past efforts, where data sets and classifications have been available to only a limited number of astronomers, the data sets from the new machine learning algorithm will be made publicly available. The astronomers have created easy-to-use, accessible software, and also released all of the data from the Pan-STARRS1 Medium Deep Survey along with the new classifications for use in other projects. Hosseinzadeh said, “It was really important to us that these projects be useful for the entire supernova community, not just for our group. There are so many projects that can be done with these data that we could never do them all ourselves.” Berger added, “These projects are open data for open science.”

The Abell 3391/95 galaxy cluster system. A 15 Mpc intergalactic medium emission filament, a warm gas bridge, infalling matter clumps, and (re-) accelerated plasma discovered by combining SRG/eROSITA data with ASKAP/EMU and DECam data

by T.H. Reiprich, A. Veronica, F. Pacaud, M.E. Ramos-Ceja, N. Ota, J. Sanders, M. Kara, T. Erben in Astronomy & Astrophysics

More than half of the matter in our universe has so far remained hidden from us. However, astrophysicists had a hunch where it might be: In so-called filaments, unfathomably large thread-like structures of hot gas that surround and connect galaxies and galaxy clusters. A team led by the University of Bonn (Germany) has now for the first time observed a gas filament with a length of 50 million light years. Its structure is strikingly similar to the predictions of computer simulations. The observation therefore also confirms our ideas about the origin and evolution of our universe.

We owe our existence to a tiny aberration. Pretty much exactly 13.8 billion years ago, the Big Bang occurred. It marked the beginning of space and time, and of all the matter that makes up our universe today. Initially concentrated at a single point, that matter expanded at breakneck speed into a gigantic gas cloud in which it was almost uniformly distributed.

Almost, but not completely: In some parts the cloud was a bit denser than in others. And for this reason alone there are planets, stars and galaxies today. This is because the denser areas exerted slightly higher gravitational forces, which drew the gas from their surroundings towards them. More and more matter therefore concentrated in these regions over time. The space between them, however, became emptier and emptier. Over the course of a good 13 billion years, a kind of sponge structure developed: large “holes” without any matter, interspersed with areas where thousands of galaxies are gathered in a small space, so-called galaxy clusters.

Fine web of gas threads

If it really happened that way, the galaxies and clusters should still be connected by remnants of this gas, like the gossamer-thin threads of a spider web. “According to calculations, more than half of all baryonic matter in our universe is contained in these filaments — this is the form of matter of which stars and planets are composed, as are we ourselves,” explains Prof. Dr. Thomas Reiprich from the Argelander Institute for Astronomy at the University of Bonn. Yet it has so far escaped our gaze: because the filaments are so enormously extended, the matter in them is extremely dilute, containing just ten particles per cubic meter, far less than the best vacuum we can create on Earth.

However, with a new measuring instrument, the eROSITA space telescope, Reiprich and his colleagues were now able to make the gas fully visible for the first time. “eROSITA has very sensitive detectors for the type of X-ray radiation that emanates from the gas in filaments,” explains Reiprich. “It also has a large field of view — like a wide-angle lens, it captures a relatively large part of the sky in a single measurement, and at a very high resolution.” This allows detailed images of such huge objects as filaments to be taken in a comparatively short time.

Confirmation of the standard model

In their study, the researchers examined a celestial object called Abell 3391/95, a system of three galaxy clusters about 700 million light years away from us. The eROSITA images show not only the clusters and numerous individual galaxies, but also the gas filaments connecting these structures. The entire filament is 50 million light years long, and it may be even larger: the scientists assume that the images show only a section.

“We compared our observations with the results of a simulation that reconstructs the evolution of the universe,” explains Reiprich. “The eROSITA images are strikingly similar to computer-generated graphics. This suggests that the widely accepted standard model for the evolution of the universe is correct.” Most importantly, the data show that the missing matter is probably actually hidden in the filaments.

Upcoming Events

JAN 18 AIxSPACE

JAN 19 APSCC 2021 Webinar Series

JAN 26 Closing the Gap Between Remote Sensing Capabilities and Customer Requirements Webinar

JAN 27 Mobile Deployable Communications Conference 2021

FEB 2 APSCC 2021 Webinar Series

MISC

Subscribe to Paradigm!

Medium. Twitter. Telegram. Telegram Chat. Reddit. LinkedIn.

Main sources

Research articles

Nature Astronomy

The Astrophysical Journal

Science Daily

Space News
