ML in seismic processing

Acquisition footprint attenuation

Fully automated approach using Deep Learning

Antonina Arefina
6 min read · Oct 11, 2022


This post continues our series on applying ML technologies to oil & gas industry tasks, particularly processing and interpreting seismic data.

Our major goal is to free expert seismologists from routine work. Today, we want to talk about one more step toward it. We developed an ML-driven algorithm for acquisition footprint attenuation, one of the final subtasks of a seismic processing pipeline.

Acquisition footprint is a type of noise that can seriously complicate the interpretation of seismic data. Therefore, seismologists put effort into attenuating it.

Existing methods for reducing it require manual parameter tuning, mostly by trial and error. Our algorithm attenuates the acquisition footprint much faster than existing semi-automatic methods and without human intervention.

What is the acquisition footprint?

Seismic cubes (or seismic volumes) are regular 3D arrays that represent subsurface structure, with two lateral dimensions (X and Y) and a depth dimension. The raw data is obtained by sequentially generating an artificial seismic wave from a large number of different locations and recording the reflected waves with a network of receivers. The final volume is the result of multiple complex procedures for processing and aggregating the raw data.

Insufficient density of signal sources and receivers during seismic acquisition and irregularities in their positions may result in grid-like noise in the resulting cube that is called acquisition footprint.

Acquisition footprint on a depth slice

Semi-automatic footprint removal

Existing footprint attenuation methods are based on filtering in the wavenumber domain. The footprint manifests as spikes in the 2D Fourier transform along the lateral dimensions. Summing along the depth dimension makes them even brighter.

Wavenumber domain filtering approach to acquisition footprint attenuation

One common approach to denoising is to apply a wavenumber 2D filter to suppress these spikes.

Sometimes, reasonable filtering parameters can be derived from acquisition geometry. In that case, the method works fast and doesn't require additional effort.
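To make the idea concrete, here is a minimal numpy sketch of wavenumber-domain notch filtering on a single depth slice. The spike positions (`kx0`, `ky0`) and notch `width` are hypothetical parameters; a real workflow would derive them from acquisition geometry or from the spectrum itself.

```python
import numpy as np

def notch_filter_slice(depth_slice, kx0, ky0, width=2):
    """Suppress footprint spikes at ±(kx0, ky0) in the 2D wavenumber domain."""
    spectrum = np.fft.fftshift(np.fft.fft2(depth_slice))
    ny, nx = depth_slice.shape
    cy, cx = ny // 2, nx // 2
    # Zero out small neighborhoods around the spike wavenumbers and their mirrors.
    for sy, sx in [(ky0, kx0), (-ky0, -kx0)]:
        spectrum[cy + sy - width : cy + sy + width + 1,
                 cx + sx - width : cx + sx + width + 1] = 0
    return np.real(np.fft.ifft2(np.fft.ifftshift(spectrum)))

# Synthetic example: a smooth signal plus a periodic "footprint" pattern.
n = 64
x = np.arange(n) * 2 * np.pi / n
signal = np.sin(x)[None, :] * np.cos(x)[:, None]       # low-wavenumber signal
footprint = 0.5 * np.cos(8 * x)[None, :].repeat(n, axis=0)  # stripes at kx = 8
denoised = notch_filter_slice(signal + footprint, kx0=8, ky0=0)
```

On this toy slice, zeroing the two spike neighborhoods removes the stripes while leaving the low-wavenumber signal intact; the drawbacks described below arise exactly when the spike positions are not this easy to pin down.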

Wavenumber domain filtering drawbacks

Manual parameter tuning

Unfortunately, in many cases, default filtering parameters produce poor results. For example, this happens when the field data are a union of different surveys, or when there are other irregularities in acquisition geometry.

Various footprint grid configurations in wavenumber and spatial domains

Such cases require manual parameter tuning and sometimes even several filtering passes.

Signal weakening

Another case to treat with caution is when a cube contains seismic events that run parallel to the footprint grid. Designing a filter that removes the footprint but doesn't alter such events is a challenging task.

Acquisition footprint collinear to a seismic event

Automatic footprint attenuation

Our team created a procedure that performs denoising automatically. It takes a seismic cube as input and outputs the denoised cube. The core of the method is a neural network that we trained to attenuate acquisition footprint in small crops. We made the network adapt to different footprint grid configurations, so the method doesn't require any tuning.

Here are some key ideas about building such an algorithm.

Training data

For training, we selected three cubes from historical data for which we had both the original cubes containing acquisition footprint and denoised versions of good quality, as assessed by expert seismologists. The major concern in seismic ML is the quality and variety of data, so we used cubes from different fields, and the targets were built by different experts using different methods.

One feature of the acquisition footprint is that it is more intense in shallow slices, while it can be negligible at greater depths.

We addressed this by discarding very shallow slices, where it was impossible to restore the useful signal, and very deep slices, where both noise and signal were faint.

A whole seismic cube, or even a single depth slice, is too large for NN models, so we used 128×128 2D crops from depth slices. For training, we generated them at random positions using `SeismicSampler`, implemented in our SeismiQB library.
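The idea behind random crop sampling can be sketched in a few lines of numpy. This is an illustration only, not the actual `SeismicSampler` API from SeismiQB; the `(depth, x, y)` axis order is an assumption.

```python
import numpy as np

def random_depth_crops(cube, crop_size=128, n_crops=8, rng=None):
    """Sample 2D crops at random lateral positions from random depth slices.

    `cube` is assumed to be a (depth, x, y) array. This sketches the idea
    behind crop sampling, not the actual SeismiQB SeismicSampler API.
    """
    rng = rng or np.random.default_rng()
    nd, nx, ny = cube.shape
    crops = []
    for _ in range(n_crops):
        d = rng.integers(0, nd)                       # random depth slice
        i = rng.integers(0, nx - crop_size + 1)       # random lateral origin
        j = rng.integers(0, ny - crop_size + 1)
        crops.append(cube[d, i : i + crop_size, j : j + crop_size])
    return np.stack(crops)

cube = np.zeros((16, 256, 256), dtype=np.float32)
batch = random_depth_crops(cube, crop_size=128, n_crops=4)
```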

Model Architecture

We used a model with an hourglass (encoder-decoder) architecture built from ResBlocks. Here is the model's config in the batchflow format, which is fairly self-explanatory.

Model config in the batchflow format

Augmentations

We wanted to build a model that could adapt to various footprint configurations, but we had a limited number of cubes suitable for training, and even fewer with complex footprint configurations. Naturally, we turned to augmentations.

We used shape scaling, a rather common technique. We also generated an additional synthetic footprint in the input crops, which allowed us to emulate the complex footprint grids that we observed in real projects.

Generating synthetic footprint

Generating additional synthetic noise turned out to be very useful and significantly boosted the model's performance.
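A synthetic footprint augmentation can be sketched as a periodic grid pattern overlaid on a training crop. This is a hypothetical sketch: the actual augmentation presumably randomizes period, amplitude, orientation, and grid shape per sample.

```python
import numpy as np

def add_synthetic_footprint(crop, period, amplitude, cross_grid=False, rng=None):
    """Overlay a periodic stripe pattern emulating acquisition footprint.

    A hypothetical augmentation sketch, not the production implementation.
    """
    rng = rng or np.random.default_rng()
    n = crop.shape[0]
    phase = rng.uniform(0, 2 * np.pi)
    stripes = amplitude * np.cos(2 * np.pi * np.arange(n) / period + phase)
    noisy = crop + stripes[None, :]          # stripes along one lateral axis
    if cross_grid:
        noisy = noisy + stripes[:, None]     # add the other axis for a grid
    return noisy

crop = np.zeros((128, 128), dtype=np.float32)
augmented = add_synthetic_footprint(crop, period=12, amplitude=0.3, cross_grid=True)
```

The network sees (augmented crop, clean target) pairs, so it learns to remove grid patterns it never encountered in the historical cubes.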

Inference

While we generate crops at random locations at train time, at inference we build a regular grid of crop origins. To smooth differences at crop borders and make the whole denoising process more robust, we use strides of ⅓ of the crop size in both lateral coordinates. We aggregate overlapping points in the resulting cube by taking the mean amplitude.
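The overlapping-grid inference with mean aggregation can be sketched as follows for a single depth slice. `model` stands in for the trained network; the edge-snapping of the last crop origin is an assumption about how full coverage is ensured.

```python
import numpy as np

def grid_origins(size, crop, stride):
    """1D crop origins on a regular grid, with the last crop snapped to the edge."""
    origins = list(range(0, size - crop + 1, stride))
    if origins[-1] != size - crop:
        origins.append(size - crop)
    return origins

def denoise_slice(depth_slice, model, crop=128):
    """Apply `model` to overlapping crops and average overlapping predictions."""
    stride = crop // 3                        # ~1/3-crop strides, as in the text
    out = np.zeros(depth_slice.shape, dtype=np.float64)
    counts = np.zeros(depth_slice.shape, dtype=np.float64)
    for i in grid_origins(depth_slice.shape[0], crop, stride):
        for j in grid_origins(depth_slice.shape[1], crop, stride):
            out[i : i + crop, j : j + crop] += model(depth_slice[i : i + crop, j : j + crop])
            counts[i : i + crop, j : j + crop] += 1
    return out / counts                       # mean over overlapping predictions

# Sanity check: with an identity "model", aggregation reproduces the input.
slice_ = np.random.default_rng(0).normal(size=(300, 300))
restored = denoise_slice(slice_, model=lambda x: x)
```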

The amplitudes in an input cube can have an almost arbitrary range, so we scale input crops to ensure that most samples lie inside the (-1, 1) range. Inverse scaling is applied to the output cube after aggregation to restore the original amplitude range.
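One simple way to achieve such scaling is to normalize by a high quantile of absolute amplitudes; the quantile choice here is a hypothetical example, the production pipeline may use a different statistic.

```python
import numpy as np

def scale_crop(crop, q=0.99):
    """Scale amplitudes so that most samples lie in (-1, 1).

    Returns the scaled crop and the factor needed to invert the
    scaling after denoising and aggregation.
    """
    scale = np.quantile(np.abs(crop), q)
    return crop / scale, scale

# Seismic amplitudes can be huge; after scaling, ~99% of samples are in (-1, 1).
crop = np.random.default_rng(0).normal(scale=5000.0, size=(128, 128))
scaled, scale = scale_crop(crop)
restored = scaled * scale   # inverse scaling recovers the original amplitudes
```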

Validation

We validated the acquisition footprint attenuation method on several real seismic cubes that were not included in the NN model's training dataset. Without any tuning, our algorithm achieves denoising quality comparable to, or even better than, the reference historical data from different fields. And it works much faster: for example, one large (2030.6 km²) cube with a complex footprint was processed in 40 minutes, while manual denoising of the same cube took 3 days.

Our method results compared to reference historical data

Summary

In this post, we described an NN model for attenuating the acquisition footprint in seismic data. It removes noise automatically and much faster than existing approaches that require parameter tuning.

We implemented the model and the denoising algorithm using our open-source libraries: batchflow for building ML models and pipelines and SeismiQB for working with seismic cubes.

Our team works hard to create fully automated procedures for all seismic processing and interpretation subtasks. Have a look at our articles about first break picking, horizon detection, and many others. In the future, we plan to cover the entire pipeline of seismic processing. Stay tuned!
