TDS Archive

An archive of data science, data analytics, data engineering, machine learning, and artificial intelligence writing from the former Towards Data Science Medium publication.


Image created with Dall·E by the author.


Graph Neural Networks Part 2. Graph Attention Networks vs. Graph Convolutional Networks

A model that pays attention to your graph

8 min read · Oct 8, 2024


Welcome to the second post about GNN architectures! In the previous post, we saw a staggering improvement in accuracy on the Cora dataset by incorporating the graph structure in the model using a Graph Convolutional Network (GCN). This post explains Graph Attention Networks (GATs), another fundamental architecture of graph neural networks. Can we improve the accuracy even further with a GAT?

First, let’s talk about the difference between GATs and GCNs. Then we’ll train a GAT and compare its accuracy with that of the GCN and the basic neural network.

This blog post is part of a series. Are you new to GNNs? I recommend starting with the first post, which explains graphs, neural networks, the dataset, and GCNs.

Graph Attention Networks

In my previous post, we saw a GCN in action. Let’s take it a step further and look at Graph Attention Networks (GATs). As you might remember, GCNs treat all neighbors equally. For GATs, this is different. GATs allow the model to learn different importance (attention) scores for different neighbors. They aggregate neighbor information by using attention mechanisms (this might ring a bell…
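To make the idea concrete, here is a minimal NumPy sketch of the single-head attention mechanism from the original GAT formulation (Veličković et al.): a shared linear transform `W`, an attention vector `a` scoring each (target, neighbor) pair through a LeakyReLU, and a softmax over the neighborhood. The function name `gat_attention` and the toy sizes are illustrative, not from the article; the article's experiments presumably use a library layer such as PyTorch Geometric's `GATConv` rather than hand-rolled code, and self-loops and multi-head attention are omitted here for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

def leaky_relu(x, slope=0.2):
    return np.where(x > 0, x, slope * x)

def gat_attention(h, neighbors, W, a):
    """Attention-weighted aggregation for one target node (row 0 of h).

    h: (n, f_in) features of the target node and its neighbors.
    neighbors: indices into h of the target's neighbors.
    W: (f_in, f_out) shared linear transform.
    a: (2 * f_out,) attention vector.
    """
    z = h @ W                      # transform all node features
    target = z[0]
    # Unnormalized score e_j = LeakyReLU(a^T [W h_target || W h_j])
    e = np.array([leaky_relu(a @ np.concatenate([target, z[j]]))
                  for j in neighbors])
    # Softmax over the neighborhood -> attention weights alpha_j
    alpha = np.exp(e - e.max())
    alpha /= alpha.sum()
    # Weighted sum of transformed neighbor features
    out = (alpha[:, None] * z[neighbors]).sum(axis=0)
    return alpha, out

# Toy example: a target node with 3 neighbors, 4 input features, 2 output features
h = rng.normal(size=(4, 4))
W = rng.normal(size=(4, 2))
a = rng.normal(size=(4,))
alpha, out = gat_attention(h, [1, 2, 3], W, a)
print(alpha)  # per-neighbor attention weights, summing to 1
```

Unlike a GCN, where each neighbor's weight is fixed by the graph's degree structure, here `alpha` is learned: training adjusts `W` and `a` so that informative neighbors receive larger weights.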



Published in TDS Archive



Written by Hennie de Harder

📈 Data Scientist & ML Engineer 💡 Simplifying complex topics ✨ Sharing fun side projects 💻 Working at IKEA and BigData Republic 🐈 Love math, cats, & running
