Space needs a hall monitor
Every unruly environment needs someone to keep tabs on all the comings and goings, like the hall monitors who are ubiquitous in schools. But while the corridors and movements inside a school are bounded and relatively easy to chart, the great hall of space — with its cosmic vastness, enormous number of objects, and trajectory uncertainties — is another thing altogether.
Sensors — devices that detect and measure inputs from the physical environment and convert them to machine- or human-readable data — are vital for this task. Sensors can sense different things, depending on the application. In missile defense, for example, a sensor-equipped satellite might look for heat signatures that are characteristic of missiles and other rockets. In that case, we would want to know where the missiles originated and where they are headed.
Another thing we can sense is radio frequency signals. These signals could come from the very object we’re trying to locate — say, to help us determine whether a satellite is still actively working. Or the signals could originate from another transmitter and happen to bounce off the thing we’re interested in; we call these “signals of opportunity.”
Space domain awareness (SDA) is the ability to monitor our increasingly cluttered space environment. The first objective is to know the orbits of everything in space. While a satellite generally stays in the same orbit, little “nudges” from atmospheric drag, the Earth’s bulginess, the Moon’s gravitational pull, and other perturbations cause its orbit to change over time. A satellite’s orbit also can change more significantly if it uses its thrusters to perform an orbital maneuver. Thus, we have to take frequent measurements of satellites to make sure our orbit estimates are good and to prevent collisions between satellites.
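To give a feel for how large one of these “nudges” can be, here is a minimal sketch (not from the article) of the best-known effect of the Earth’s bulginess: the slow westward drift of an orbit’s ascending node caused by the J2 oblateness term. The orbit parameters below are hypothetical, chosen to resemble a low-Earth orbit like the ISS’s.

```python
import math

# Standard physical constants (WGS-84 values).
MU = 398600.4418      # Earth's gravitational parameter, km^3/s^2
J2 = 1.08263e-3       # Earth's second zonal harmonic (oblateness)
R_E = 6378.137        # Earth's equatorial radius, km

def nodal_regression_deg_per_day(a_km, e, inc_deg):
    """Average J2-induced drift rate of the orbit's ascending node, deg/day."""
    n = math.sqrt(MU / a_km**3)          # mean motion, rad/s
    p = a_km * (1.0 - e**2)              # semi-latus rectum, km
    raan_dot = -1.5 * J2 * (R_E / p)**2 * n * math.cos(math.radians(inc_deg))
    return math.degrees(raan_dot) * 86400.0

# A hypothetical orbit similar to the ISS's: the node drifts westward by
# several degrees per day -- purely from the shape of the Earth.
rate = nodal_regression_deg_per_day(a_km=7000.0, e=0.0, inc_deg=51.6)
print(f"nodal regression: {rate:.2f} deg/day")
```

Even this single, well-modeled perturbation moves the orbit plane measurably every day; add drag and lunar gravity, and it becomes clear why orbit estimates go stale without fresh measurements.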
SDA sensing has been performed primarily from the ground, using radar and optical telescopes. There is renewed interest in shifting some of this sensing to space-based sensors, particularly for tracking satellites farther from the Earth, in what we call cislunar space. This area of space stretches from geosynchronous orbit (GEO) — at an altitude of 35,786 km — to beyond the Moon, which is some 384,400 km from the Earth (on average; the Moon’s orbit is not a perfect circle).
Using space-based sensors for monitoring satellites in cislunar space has many advantages over employing ground-based sensors. These benefits include shorter distances between the sensor and the tracked satellite (enabling better resolution); different viewing geometry (seeing things from different angles helps us pinpoint where they are); and greater opportunities for sensing (sensing opportunities from the ground are very sparse due to the geometry of solar illumination).
Autonomy increasingly is being explored as an enhanced capability for these space sensor missions. Autonomous sensor-equipped aerospace vehicles can make decisions about what measurements and information to collect, as well as where to collect it, without human intervention. While a human reasonably can task a single sensor or a handful of sensors, it becomes more and more difficult — realistically, all but impossible — to coordinate a large network of sensors without some level of sensor autonomy.
For example, take the enormously complex task of broadcasting a live sporting event. Managing the multiple cameras and assembling the feed requires dozens if not hundreds of people. Now imagine that we’re interested not only in what is happening on the field or court, but also in what is happening in the neighborhood, city, region or country.
Merging information is key to sensor autonomy. My research group is working on developing estimation algorithms that fuse information from a variety of sources, including conventional sensors (cameras and radar), as well as language inputs from humans. The next generation of sensing will be enabled by combining information across sensors and vehicles, an approach called multi-sensor fusion.
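As a toy illustration of the idea (a textbook sketch, not the group’s actual algorithms), the simplest form of multi-sensor fusion combines independent Gaussian measurements by inverse-variance weighting: each sensor reports an estimate plus its uncertainty, the fused estimate leans toward the more certain sensor, and the fused uncertainty is smaller than any single sensor’s.

```python
import numpy as np

def fuse(estimates, variances):
    """Fuse independent scalar measurements by inverse-variance weighting."""
    w = 1.0 / np.asarray(variances, dtype=float)    # precision weights
    fused_var = 1.0 / w.sum()                       # always <= min(variances)
    fused_mean = fused_var * float(w @ np.asarray(estimates, dtype=float))
    return fused_mean, fused_var

# Two hypothetical range measurements of the same satellite (km).
# Sensor B is four times more precise, so the result sits much closer to it.
mean, var = fuse(estimates=[100.0, 102.0], variances=[4.0, 1.0])
print(mean, var)  # prints 101.6 0.8
```

The same principle, generalized to vectors, nonlinear dynamics and many objects, is what makes a network of modest sensors collectively sharper than any one of them.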
While the basic premise of multi-sensor fusion has been around for decades, its full potential has yet to be realized due to significant practical challenges related to computational complexity and communication bandwidth limitations. These obstacles span multiple research areas and are inherently multidisciplinary. In this respect, we benefit from affiliation with Purdue’s Institute for Control, Optimization and Networks (ICON), which brings in faculty from other Purdue schools, as well as outside experts, for interdisciplinary collaborations.
While our research often is motivated by aerospace applications, its relevance goes far beyond aerospace. If you strip away the context of the problem to reach the bare mathematics, you’ll find the same research questions in robotics and even finance. The regular information exchange among ICON faculty members provides an excellent opportunity to recognize those commonalities and broaden our research’s impact to transcend the aerospace community.
Keith LeGrand, PhD
Assistant Professor, School of Aeronautics and Astronautics
Director, Sensing, Controls, and Probabilistic Estimation (SCOPE) Group
Member, Purdue Engineering Initiative in Cislunar Space (Cislunar Initiative)
Faculty Contributor, Institute for Control, Optimization and Networks (ICON)
College of Engineering
Profile of Professor Keith LeGrand: Information value and satellite automation
2019 IEEE Aerospace Conference: ‘Survey of challenges in labeled random finite set distributed multi-sensor multi-object tracking’ (free access to abstract)
IEEE Transactions on Pattern Analysis and Machine Intelligence: ‘Cell Multi-Bernoulli (Cell-MB) sensor control for multi-object search-while-tracking (SWT)’ (open access)