
Delayed Message Passing

Dynamically Rewired Delayed Message Passing GNNs

Message-passing graph neural networks (MPNNs) tend to suffer from the phenomenon of over-squashing, causing performance deterioration for tasks relying on long-range interactions. This can be largely attributed to message passing only occurring locally, over a node’s immediate neighbours. Traditional static graph rewiring techniques typically attempt to counter this effect by allowing distant nodes to communicate instantly (and in the extreme case of Transformers, by making all nodes accessible at every layer). However, this incurs a computational price and comes at the expense of breaking the inductive bias provided by the input graph structure. In this post, we describe two novel mechanisms to overcome over-squashing while offsetting the side effects of static rewiring approaches: dynamic rewiring and delayed message passing. These techniques can be incorporated into any MPNN and lead to better…
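To make the two mechanisms above concrete, here is a minimal, self-contained sketch of a dynamically rewired, delayed message-passing layer. It is not the exact architecture described in this post: the helper `hop_distances`, the class `DelayedDRewLayer`, the delay parameter `nu`, the sum aggregation, and the toy path graph are all illustrative assumptions. The intent is only to show the two ideas in code: at layer ℓ a node aggregates from neighbours up to ℓ+1 hops away (dynamic rewiring), and a message from a k-hop neighbour is computed from that neighbour's state as it was roughly `nu·(k−1)` layers earlier (delay).

```python
# Minimal sketch of dynamically rewired, delayed message passing.
# Assumptions (not from the post): dense adjacency, sum aggregation,
# a single shared message transform for all hop distances.

import torch
import torch.nn as nn


def hop_distances(adj: torch.Tensor) -> torch.Tensor:
    """All-pairs shortest-path hop distances on a dense adjacency matrix."""
    n = adj.shape[0]
    dist = torch.full((n, n), float("inf"))
    dist.fill_diagonal_(0)
    frontier = adj.clone().bool()   # pairs at the current hop distance
    k = 1
    while frontier.any():
        newly = frontier & torch.isinf(dist)
        dist[newly] = k
        # expand the frontier by one hop, keeping only unvisited pairs
        frontier = ((frontier.float() @ adj) > 0) & torch.isinf(dist)
        k += 1
    return dist


class DelayedDRewLayer(nn.Module):
    """One sketch layer: aggregate from k-hop neighbours for k <= layer + 1,
    reading the k-hop neighbours' states from nu*(k-1) layers ago."""

    def __init__(self, dim: int, nu: int = 1):
        super().__init__()
        self.lin_self = nn.Linear(dim, dim)
        self.lin_msg = nn.Linear(dim, dim)
        self.nu = nu

    def forward(self, history, dist, layer):
        # history: list of node-state tensors [h^0, ..., h^layer], each (n, dim)
        h = history[-1]
        agg = torch.zeros_like(h)
        max_hop = layer + 1                               # dynamic rewiring: radius grows with depth
        for k in range(1, max_hop + 1):
            mask = (dist == k).float()                    # (n, n) indicator of k-hop pairs
            src = max(layer - self.nu * (k - 1), 0)       # delay: older states for farther nodes
            agg = agg + mask @ self.lin_msg(history[src]) # sum-aggregate delayed k-hop messages
        return torch.relu(self.lin_self(h) + agg)


# Toy usage on a 6-node path graph
n, dim = 6, 8
adj = torch.zeros(n, n)
for i in range(n - 1):
    adj[i, i + 1] = adj[i + 1, i] = 1
dist = hop_distances(adj)

layers = nn.ModuleList([DelayedDRewLayer(dim, nu=1) for _ in range(4)])
history = [torch.randn(n, dim)]
for t, layer in enumerate(layers):
    history.append(layer(history, dist, t))
print(history[-1].shape)  # torch.Size([6, 8])
```

Note the contrast with static rewiring: distant nodes are never connected all at once; they come into reach gradually as depth grows, and the delay keeps their contributions tied to older, less compressed representations rather than forcing everything through the current layer's bottleneck.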

Michael Bronstein · Published in TDS Archive · 9 min read · Jun 19, 2023

