UBC, Google & Amii’s Exphormer: Scaling Graph Transformers While Slashing Costs
The ability of graph transformers (GTs) to model long-range interactions has improved expressivity and made such architectures a promising alternative to traditional graph neural networks (GNNs). The downside of GTs is their poor scalability, which renders them prohibitively expensive on large and complex graphs.
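To see where the cost comes from, consider a minimal sketch of dense self-attention over node features (illustrative only; the function name, shapes, and identity projections are assumptions, not Exphormer's actual implementation): every node attends to every other node, so the score matrix grows quadratically with the number of nodes N.

```python
import torch

def dense_graph_attention(x: torch.Tensor) -> torch.Tensor:
    """Naive single-head self-attention over all node pairs.

    x: (N, d) node feature matrix. The (N, N) score matrix is what makes
    full graph attention quadratic in the number of nodes.
    """
    n, d = x.shape
    q, k, v = x, x, x                 # identity projections keep the sketch minimal
    scores = q @ k.T / d ** 0.5       # (N, N) -- quadratic memory and compute
    weights = torch.softmax(scores, dim=-1)
    return weights @ v                # (N, d) updated node features

x = torch.randn(1000, 64)
out = dense_graph_attention(x)
print(out.shape)  # torch.Size([1000, 64])
```

At 1,000 nodes the score matrix already has a million entries; at 100,000 nodes it has 10^10, which is why dense graph transformers become impractical on large graphs.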