Introduction to Sensor Arrays

How to enhance data with multiple sensors

Isaac Berrios
9 min read · Jul 2, 2024

In this post, we introduce Sensor Arrays and discuss their benefits. We will walk through a Python example of a Uniform Linear Array and show how to compute its Array Patterns for different angles. We also introduce the Steering Vector, which is crucial for working with arrays.


Motivation

We use sensors to collect information, learn, and ultimately make decisions in our environment; the quality of these decisions is limited by the quality of the sensor observations. On a boat, a sonar transducer tells us the depth of underwater objects; figure 1 shows basic sonar operation. The sonar continuously acquires a 1D depth vector and uses it to update the depth map. It does this with a 3D beam, but it doesn’t provide any 3D info; it only provides the depth at the current location.

Figure 1. Example of a Sonar on a boat creating a depth map of its underwater environment. Source.

In order to obtain 3D info, such as the surface underneath the boat, we need measurements that capture both depth and angle.

Figure 2. 3D Sonar scanning. Source.

We can do this with an array of Sonar Transducers. Each individual transducer provides a single depth value, and we can combine all of these depth values to produce a vector of depths for all angles in a region (usually the region of beam coverage, as in figure 2). The sonar array provides values of depth and angle over time, as opposed to just depth over time. Now we have a whole new dimension of valuable data to work with.

Sensor Arrays

In general, we can use sensor arrays to obtain better observations of the environment, and these observations will almost always be better than those of a single sensor element. The key benefit of an array is that it allows us to accurately estimate the direction (angle) of an incoming signal. Sensor arrays are used in:

  • Radar
  • Sonar
  • Audio/Acoustics
  • Radio Astronomy
  • Seismology
  • Wireless Communications

Each topic listed above poses a unique challenge to the design of an array, but they all in some way deal with sinusoidal signals and plane waves. This post introduces arrays as a general topic, and we will start with the simplest array, the Uniform Linear Array.

Uniform Linear Array

Sensor arrays come in all shapes and sizes, but in this post we will only consider the Uniform Linear Array (ULA). It consists of elements equally spaced along a straight line and receives signals in the form of plane waves; these signals are sinusoids that have a frequency and a vector direction in 3D space. It is important to know that plane waves (may) impinge on the different array elements at different times due to the spacing; this is the key to estimating the Angle of Arrival (AoA). In the image below the plane wave is approaching from angle θ; we define θ = 0° to be head on or broadside, and θ = ±90° to be the edges or endfire.

Figure 3. Four element Uniform Linear Array with an impinging plane wave at angle θ. Source.

For this post, we will work with a ULA and make the following assumptions to simplify things for some Python examples.

  • The distance d between adjacent elements is half the wavelength λ of the signals of interest (see the short sketch after this list)
  • Each element is isotropic (radiates equally in all directions).
  • The array Field of View is limited to θ = ±90°
  • The array only transmits and receives signals at a certain frequency (Narrowband assumption)
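To make the half-wavelength spacing concrete, here is a minimal sketch of how the spacing could be computed for a given operating frequency. The 40 kHz sonar frequency and sound speed are illustrative values I am assuming here, not values from the original design.

SPEED_OF_SOUND = 1500.0  # m/s, approximate speed of sound in sea water (assumed)
f0 = 40e3                # Hz, assumed narrowband operating frequency

wavelength = SPEED_OF_SOUND / f0  # lambda = c / f
d = wavelength / 2                # element spacing d = lambda / 2
print(f"wavelength: {wavelength*100:.3f} cm, element spacing: {d*100:.3f} cm")
# wavelength: 3.750 cm, element spacing: 1.875 cm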

Beam Pattern

The Beam Pattern (or Array Pattern) describes how the array responds to signals coming from different directions. The Steering Vector allows us to steer the array and change its response to signals coming from a desired direction. We can also use the steering vector to steer the Beam Pattern; see figure 6 below for an example of the Steering Vector in action.

Steering Vector

The steering vector describes the array response to a signal in a particular direction; whether the signal is transmitted or received, the steering vector will be the same. More formally, the steering vector gives us a complex phasor representation of the incident signal, and it is composed of the two vectors below.
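As a sketch of these two quantities (the coordinate and sign conventions here are my own choice; different texts vary), place the ULA elements along the x-axis with spacing d and consider a plane wave arriving from angle θ:

$$\mathbf{p}_m = \begin{bmatrix} m\,d \\ 0 \\ 0 \end{bmatrix}, \qquad \mathbf{k} = \frac{2\pi}{\lambda}\begin{bmatrix} \sin\theta \\ \cos\theta \\ 0 \end{bmatrix}$$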

The element position vector describes the position of each array element, and the wavevector describes the direction of the incident signal at a given frequency. We denote the steering vector as v(k).
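Putting these together, a common form of the steering vector for an M element array (a reconstruction, not copied from the original figure) stacks one complex phasor per element:

$$\mathbf{v}(\mathbf{k}) = \begin{bmatrix} e^{-j\,\mathbf{k}\cdot\mathbf{p}_0} & e^{-j\,\mathbf{k}\cdot\mathbf{p}_1} & \cdots & e^{-j\,\mathbf{k}\cdot\mathbf{p}_{M-1}} \end{bmatrix}^{T}$$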

The term wavevector can be confusing; for simplicity, you can think of it as a vector that describes the direction of a signal at a certain frequency.

The Array Manifold is a matrix of Steering Vectors for all possible angles

Let’s derive the Steering Vector for our ULA; we will focus on a single element m.
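The derivation below is a sketch consistent with the code later in the post, assuming the element positions and wavevector defined above. A plane wave from angle θ reaches element m with an extra path length of m·d·sin θ, i.e. a time delay of τₘ = m·d·sin θ / c. Under the narrowband assumption this delay is simply a phase shift of 2πf·τₘ = 2π·m·d·sin θ / λ, so with d = λ/2 the m-th component of the steering vector becomes

$$v_m(\theta) = e^{-j\,\mathbf{k}\cdot\mathbf{p}_m} = e^{-j\frac{2\pi}{\lambda} m d \sin\theta} = e^{-j\pi m \sin\theta}$$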

The resulting expression provides a way to steer the ULA with a single angle value θ; the length of the steering vector is the same as the number of elements. This is because we need to tell each element how to behave in order to steer in the direction that we want.

Each component describes the array response of a certain element at a particular angle. If we have a head-on (broadside) plane wave signal, then we would expect all elements of the array to behave the same, since the signal would impinge on them all at the same time. However, if we have a signal off to the side, then we expect the array elements to have different responses: the signal impinges on each element differently, so it is received at each element with a different time delay. This time delay can be modeled as a phase shift across the elements. When we say the Steering Vector tells the array elements how to behave, we are really just telling them how much to shift their phases in order to steer where we want.

Code the Steering Vector

To get a better understanding, let’s code an example. First we will make a special function to compute the Steering Vector, then we will compute the array response across θ = ±90°. Here the array response is taken to be the inverse of the variance across all array elements. A signal with the highest possible array response would have arrival angle θ = 0°; this would theoretically produce zero variance across the elements, since they would all receive the signal simultaneously and in phase.

import numpy as np
import matplotlib.pyplot as plt

def compute_steering_vector(num_ant=8, angle_res=1, angle_rng=90):
    """ Computes an array of Steering Vectors for a desired angular range
    and resolution. **This is a special function that only computes the
    steering vectors along a 1D linear axis.**
    Inputs:
        angle_res - angle resolution in degrees
        angle_rng - single sided angle range
        num_ant   - number of virtual antennas
    Output:
        steering_vectors
    """
    # get number of steering vectors based on desired angle range and resolution
    num_vec = (2 * angle_rng / angle_res + 1)
    num_vec = int(round(num_vec))

    # convert to radians
    angle_rng = angle_rng * np.pi / 180
    angle_res = angle_res * np.pi / 180

    # compute steering vectors
    steering_vectors = np.zeros((num_vec, num_ant), dtype=np.complex64)
    for k in range(num_vec):
        for m in range(num_ant):
            steering_vectors[k, m] = np.exp(-1j * np.pi * m
                                            * np.sin(-angle_rng + k * angle_res))

    return steering_vectors


# Get steering vectors for all angles
array_response = compute_steering_vector(num_ant=8, angle_res=1, angle_rng=90)

# compute power of array response
response_pwr = np.var(array_response, axis=1)
response_pwr /= response_pwr.max() # normalize
plt.plot(np.linspace(-90, 90, len(response_pwr)), -10*np.log10(response_pwr))
plt.title("Normalized Array Response")
plt.xlabel("Azimuth Angle")
plt.ylabel("Normalized Power");
Figure 4. Normalized Array response. Source: Author

The plot above shows the array response at each (steering) angle for an 8 element array. Notice how we have a high response for angles close to zero and how quickly it dips off to the sides. This means that we will lose accuracy when we estimate desired properties of signals that are far away from θ = 0°. The array response gives us a general idea of what type of angular accuracy to expect, but it doesn’t tell us the whole story. To really assess performance, we want to look at how steering the array impacts performance across different angles. Now we will introduce the Beam Pattern.

Steering the Beam Pattern

The Beam Pattern represents the gain of the array at a certain steering angle and frequency. It indicates the directional selectivity of the array; in other words, it tells us how accurately we can estimate the Angle of Arrival. An array with more elements produces a narrower response, and a narrower response means better angular accuracy. We can conclude that more elements means better angular accuracy. The best part is that these benefits apply to any application of arrays, whether it be Radar, Sonar, or Seismology.
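As a rough rule of thumb (an approximation I am adding for context, not from the original post), the main lobe width of a broadside ULA scales inversely with the array aperture M·d:

$$\theta_{3\,\mathrm{dB}} \approx \frac{\lambda}{M d}\ \text{radians} \approx \frac{2}{M}\ \text{radians for } d = \lambda/2$$

so doubling the number of elements roughly halves the main lobe width.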

We can compute the Beam Pattern by taking the inner product of each Array Response vector with the Hermitian of the Steering Vector for the desired direction θₛ, and we do this for all angles θ.
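Written out (my reconstruction, chosen to match the code below), with v(θ) the manifold vector at sweep angle θ and v(θₛ) the steering vector for the desired direction:

$$B(\theta) = \mathbf{v}^{H}(\theta)\,\mathbf{v}(\theta_s) = \sum_{m=0}^{M-1} e^{\,j\pi m \sin\theta}\, e^{-j\pi m \sin\theta_s}$$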

Let’s go ahead and compute the Beam Pattern in Python:

M = 20          # number of antennas
theta = 25      # steering angle in degrees
angle_res = 0.1 # angle resolution in degrees

# array element vector
r = np.arange(0, M)

# compute steering vectors for all angles (array manifold)
array_manifold = compute_steering_vector(num_ant=M, angle_res=angle_res, angle_rng=90)

# steering vector for the desired direction
steering_vector = np.exp(-1j*np.pi*r*np.sin(theta*np.pi/180))

num_angles = array_manifold.shape[0]
angle_span = np.linspace(-90, 90, num_angles)

# Beam Pattern: inner product of each manifold vector with the steering vector
pattern = np.conjugate(array_manifold) @ steering_vector[:, None]

# plot the Beam Pattern magnitude in dB, normalized so the peak sits at 0 dB
plt.plot(angle_span, 20*np.log10(np.abs(pattern.squeeze()) / M))
plt.title("Beam Pattern")
plt.xlabel("Azimuth Angle")
plt.ylabel("Normalized Gain (dB)");
Figure 5. 20 element array Beam Pattern at 25°. Source: Author.

We can see the largest peak is steered to 25°; we call this peak the main lobe of the array, while the smaller lobes are called side-lobes. We generally want a narrow main lobe and low side-lobes, but it turns out that lowering the side-lobes increases the width of the main lobe and vice versa. So we need to make design decisions to find an acceptable trade-off.

Now let’s steer the Beam Pattern across all angles; this time we will use 80 elements (I had to reduce the range to ±80° so that the GIF would upload).

M = 80           # number of antennas
angle_res = 0.25 # angle resolution in degrees

# array element vector
r = np.arange(0, M)

# array manifold over the reduced ±80° range
array_manifold = compute_steering_vector(num_ant=M, angle_res=angle_res, angle_rng=80)

num_angles = array_manifold.shape[0]
angle_span = np.linspace(-80, 80, num_angles)

# steering angles to sweep across the Field of View
thetas = np.arange(-80, 80 + angle_res, angle_res)

array_patterns = []
for _theta in thetas:
    # steering vector for the current steering angle
    steering_vector = np.exp(-1j*np.pi*r*np.sin(_theta*np.pi/180))
    # Beam Pattern steered to the current angle
    pattern = np.conjugate(array_manifold) @ steering_vector[:, None]
    array_patterns.append(pattern)
Figure 6. 80 element Beam Patterns swept across ±80°. Source: Author.

Notice that the main lobe is higher and narrower: adding more elements increases the gain and directivity of the array, which in turn allows us to make more accurate measurements. We also get more side-lobes when we have more elements. Another thing to notice is that the main lobe gets wider and wider as it steers out toward the edges. This means that our angular accuracy decreases for signals impinging from angles near the edges of the Field of View.

Next Steps

In this post we introduced the concept of Sensor Arrays and derived the Steering Vector for a Uniform Linear Array. We then visualized the array response and Beam patterns for multiple angles.

Key Takeaways

  • Sensor Array — An arrangement of sensor elements
  • Steering Vector — A complex phasor that describes the Array Response to a signal in a particular direction
  • Beam Pattern — The gain of the array across its Field of View
  • Directivity — Represents the angular selectivity of an array
    - More elements provides a higher directivity
    - Higher directivity means higher angular accuracy
  • The Beam Pattern can be steered (tuned) to different angles with the Steering Vector
  • The main lobe of the Beam Pattern loses directivity as we steer towards the edges of the Field of View

This was a high-level introduction; if you’re still interested in arrays, feel free to explore some more:

References

