TDS Archive

An archive of data science, data analytics, data engineering, machine learning, and artificial intelligence writing from the former Towards Data Science Medium publication.


New GNN architecture

Co-operative Graph Neural Networks

The large majority of Graph Neural Networks (GNNs) follow the message-passing paradigm, where node states are updated from aggregated neighbour messages. In this post, we describe Co-operative GNNs (Co-GNNs), a new message-passing architecture in which every node is viewed as a player that can choose to ‘listen’, ‘broadcast’, ‘listen & broadcast’, or ‘isolate’. Standard message passing is the special case in which every node ‘listens & broadcasts’ to all of its neighbours. We show that Co-GNNs are asynchronous and more expressive, and that they can address common pathologies of standard message-passing GNNs such as over-squashing and over-smoothing.
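To make the action mechanism concrete, here is a minimal sketch (not the authors' implementation) of one action-gated propagation step. The function name, action encoding, and mean aggregation are illustrative assumptions; the key point is that a message flows along a directed edge (u, v) only when u broadcasts and v listens, so standard message passing is recovered when every node picks ‘listen & broadcast’.

```python
import numpy as np

# Hypothetical encoding of the four node actions (for illustration only).
LISTEN, BROADCAST, LISTEN_BROADCAST, ISOLATE = 0, 1, 2, 3

def cognn_aggregate(x, edges, actions):
    """One action-gated propagation step (mean aggregation sketch).

    A message is sent along edge (u, v) only if u broadcasts
    (BROADCAST or LISTEN_BROADCAST) and v listens
    (LISTEN or LISTEN_BROADCAST). Isolated nodes neither send
    nor receive messages.
    """
    n, d = x.shape
    agg = np.zeros((n, d))
    deg = np.zeros(n)
    for u, v in edges:
        u_sends = actions[u] in (BROADCAST, LISTEN_BROADCAST)
        v_listens = actions[v] in (LISTEN, LISTEN_BROADCAST)
        if u_sends and v_listens:
            agg[v] += x[u]
            deg[v] += 1
    deg[deg == 0] = 1  # nodes receiving no messages keep a zero aggregate
    return agg / deg[:, None]
```

With all actions set to `LISTEN_BROADCAST`, this reduces to ordinary mean aggregation over incoming neighbours; changing a single node's action to `ISOLATE` removes its messages from the computation, which is what lets Co-GNNs prune or redirect information flow per layer.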

11 min read · Dec 6, 2023


Illustration of node actions in Co-GNNs: standard, listen, broadcast, and isolate. Image credit: DALL-E 3.

This post was co-authored with Ben Finkelshtein, Ismail Ceylan, and Xingyue Huang and is based on the paper B. Finkelshtein et al., Cooperative Graph Neural Networks (2023) arXiv:2310.01267.


Written by Michael Bronstein

DeepMind Professor of AI @Oxford. Serial startupper. ML for graphs, biochemistry, drug design, and animal communication.
