Introduction to Graph Neural Networks
Interested in Graph Neural Networks and want a roadmap on how to get started? In this article, we'll give a brief outline of the field and share blogs and resources!
Created on January 22|Last edited on March 3
Over the past few years, there has been growing interest in the use of graph data in machine learning. Why is that? Well, graphs provide an excellent mathematical structure for representing molecules (leading to groundbreaking research like AlphaFold) and networks of many other kinds. They have also recently emerged as a key meta-structure: modalities such as vision, text, and speech can be seen as special cases of graphs, and so Graph Representation Learning has become increasingly important.
In this article we will provide a brief overview of the field based on the excellent survey paper Everything is Connected: Graph Neural Networks by Petar Veličković and share key ideas, important notation and links to other relevant blogs.

Important Notation

- Let's start by defining a simple and modular description of a graph. A graph $\mathcal{G} = (\mathcal{V}, \mathcal{E})$ is defined as a tuple of a set of nodes $\mathcal{V}$ and a set of edges $\mathcal{E} \subseteq \mathcal{V} \times \mathcal{V}$, consisting of pairs of nodes that are connected. Both nodes and edges can have properties which are of interest to us.
- Every node $u \in \mathcal{V}$ is said to have a $k$-dimensional feature vector $\mathbf{x}_u \in \mathbb{R}^k$. If we stack the features of all the nodes we get the (node) feature matrix $\mathbf{X} \in \mathbb{R}^{|\mathcal{V}| \times k}$.
- There are a number of ways in which we could store information about the edges $\mathcal{E}$; the most common method is to use an adjacency matrix $\mathbf{A} \in \{0, 1\}^{|\mathcal{V}| \times |\mathcal{V}|}$, where

$$a_{uv} = \begin{cases} 1 & (u, v) \in \mathcal{E} \\ 0 & \text{otherwise} \end{cases}$$

- The graph structure also gives every node some locality around it. This is usually described as the nodes surrounding, or in the neighborhood of, some node $u$:

$$\mathcal{N}_u = \{ v \mid (u, v) \in \mathcal{E} \lor (v, u) \in \mathcal{E} \}$$

And the multiset of all neighborhood features can be defined as:

$$\mathbf{X}_{\mathcal{N}_u} = \{\!\{ \mathbf{x}_v \mid v \in \mathcal{N}_u \}\!\}$$

- We define a local function, i.e. the function which computes the features of a local region of the graph, as

$$\mathbf{h}_u = \phi(\mathbf{x}_u, \mathbf{X}_{\mathcal{N}_u})$$
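The notation above maps directly onto arrays. As a minimal NumPy sketch (the toy graph, node count, and feature dimension below are made up for illustration), we can build the feature matrix $\mathbf{X}$, the adjacency matrix $\mathbf{A}$, and the neighborhood of a node:

```python
import numpy as np

# A toy undirected graph with 4 nodes and edges (0,1), (0,2), (2,3).
num_nodes = 4
edges = [(0, 1), (0, 2), (2, 3)]

# Node feature matrix X of shape (|V|, k), here with k = 2 random features.
X = np.random.randn(num_nodes, 2)

# Adjacency matrix A, with a_uv = 1 iff (u, v) is an edge.
A = np.zeros((num_nodes, num_nodes))
for u, v in edges:
    A[u, v] = 1
    A[v, u] = 1  # undirected graph: store both directions

# Neighborhood N_0 of node 0, and the multiset of its neighbors' features.
neighbors_of_0 = np.nonzero(A[0])[0]   # → [1 2]
X_neigh_0 = X[neighbors_of_0]          # shape (2, 2)
```

In practice, libraries like PyTorch Geometric store edges as a sparse edge-index list rather than a dense adjacency matrix, but the dense form matches the notation here most directly.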
Equivariance and Invariance
Most modern-day geometric deep learning relies on exploiting the underlying symmetries of our natural world. These symmetries are exploited via various properties; we shall discuss two such properties today, taking permutation as our example symmetry operation:
- Invariance: "shuffling the input doesn't change the output." A function $f$ is said to be permutation invariant if shuffling the order of the inputs leaves the output unchanged. In the case of graphs this is realised by using some permutation matrix $\mathbf{P}$ to change the order of the nodes, and correspondingly the edges, by permuting the adjacency matrix to $\mathbf{P}\mathbf{A}\mathbf{P}^\top$. Thus, permutation invariance can be formalised as:

$$f(\mathbf{P}\mathbf{X}, \mathbf{P}\mathbf{A}\mathbf{P}^\top) = f(\mathbf{X}, \mathbf{A})$$

- Equivariance: "shuffling the input also shuffles the output." A function $\mathbf{F}$ is said to be permutation equivariant if shuffling the order of the inputs shuffles the output in the same way. Similarly as above, we can formalise permutation equivariance as:

$$\mathbf{F}(\mathbf{P}\mathbf{X}, \mathbf{P}\mathbf{A}\mathbf{P}^\top) = \mathbf{P}\,\mathbf{F}(\mathbf{X}, \mathbf{A})$$
Other symmetries are also observed, such as shifts and rotations, which have profound implications in domains such as vision and molecular modeling.
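Both properties are easy to verify numerically. As a sketch (the specific layer and readout below are simplified stand-ins, not a particular published architecture): a layer that aggregates neighbors and applies a shared linear map is permutation equivariant, and sum-pooling its output into a single graph-level vector makes it permutation invariant:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 5, 3

X = rng.standard_normal((n, k))                    # node features
A = rng.integers(0, 2, size=(n, n)).astype(float)
A = np.maximum(A, A.T)                             # symmetrise: undirected graph
W = rng.standard_normal((k, k))                    # shared weight matrix

# A random permutation matrix P (reorders the nodes).
P = np.eye(n)[rng.permutation(n)]

def layer(X, A):
    # Simple equivariant graph layer: aggregate neighbors, then a linear map.
    return (A @ X) @ W

def readout(X, A):
    # Sum-pool node features into one graph-level vector (order-independent).
    return layer(X, A).sum(axis=0)

# Equivariance: F(PX, PAP^T) == P F(X, A)
assert np.allclose(layer(P @ X, P @ A @ P.T), P @ layer(X, A))
# Invariance: f(PX, PAP^T) == f(X, A)
assert np.allclose(readout(P @ X, P @ A @ P.T), readout(X, A))
```

The equivariance check works because $\mathbf{P}^\top\mathbf{P} = \mathbf{I}$, so $(\mathbf{P}\mathbf{A}\mathbf{P}^\top)(\mathbf{P}\mathbf{X})\mathbf{W} = \mathbf{P}(\mathbf{A}\mathbf{X}\mathbf{W})$.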
Graph Neural Networks
Defining the aforementioned local function $\phi$ is of key importance in graph representation learning, and much of the field revolves around designing good permutation-invariant local functions that exhibit the right symmetry and computational properties. Most methods can be grouped into three broad classes, namely Graph Convolutional Networks, Graph Attention Networks, and Message Passing Graph Networks. To learn more, I'd recommend going through these introductory articles on some key methods in each family:
1. Convolutional
2. Attentional
3. Message Passing
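The three flavors differ in how a neighbor $v$'s contribution to node $u$ is computed: a fixed weight (convolutional), a learned attention score (attentional), or a full learned message $\psi(\mathbf{x}_u, \mathbf{x}_v)$ (message passing). As a minimal sketch of the most general, message-passing flavor (with the message and update functions $\psi$ and $\phi$ simplified to single linear maps, and a sum aggregator, purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(42)
n, k = 4, 3
X = rng.standard_normal((n, k))
A = np.array([[0, 1, 1, 0],
              [1, 0, 0, 0],
              [1, 0, 0, 1],
              [0, 0, 1, 0]], dtype=float)

W_msg = rng.standard_normal((2 * k, k))  # psi: message function (here just linear)
W_upd = rng.standard_normal((2 * k, k))  # phi: update function (here just linear)

def message_passing_layer(X, A):
    n, k = X.shape
    H = np.zeros_like(X)
    for u in range(n):
        # Sum of messages psi(x_u, x_v) over neighbors v — a sum is
        # permutation invariant, so neighbor order doesn't matter.
        msgs = np.zeros(k)
        for v in np.nonzero(A[u])[0]:
            msgs += np.concatenate([X[u], X[v]]) @ W_msg
        # Update: h_u = phi(x_u, aggregated messages)
        H[u] = np.concatenate([X[u], msgs]) @ W_upd
    return H

H = message_passing_layer(X, A)  # shape (4, 3)
```

The convolutional and attentional flavors are special cases: replace the learned message with $c_{uv}\,\psi(\mathbf{x}_v)$ for fixed coefficients, or with $a(\mathbf{x}_u, \mathbf{x}_v)\,\psi(\mathbf{x}_v)$ for learned attention scores.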
Graph Convolutional Networks (GCN)
An Introduction to Convolutional Graph Neural Networks
This article provides a beginner-friendly introduction to Convolutional Graph Neural Networks (GCNs), which apply deep learning paradigms to graphical data.
A Brief Introduction to Residual Gated Graph Convolutional Networks
This article provides a brief overview of the Residual Gated Graph Convolutional Network architecture, complete with code examples in PyTorch Geometric and interactive visualizations using W&B.
An Introduction to GraphSAGE
This article provides an overview of the GraphSAGE neural network architecture, complete with code examples in PyTorch Geometric, and visualizations using W&B.
Graph Attention Networks (GAT)
A Brief Introduction to Graph Attention Networks
This article provides a brief overview of the Graph Attention Networks architecture, complete with code examples in PyTorch Geometric and interactive visualizations using W&B.
A Brief Introduction to Mixture Model Networks (MoNet)
This article provides an overview of the Mixture Model Networks (MoNet) architecture, with code examples in PyTorch Geometric and interactive visualizations using W&B.
An Introduction to Graph Attention Networks
This article provides a beginner-friendly introduction to Attention based Graphical Neural Networks (GATs), which apply deep learning paradigms to graphical data.
Message Passing Graph Neural Networks (MPGNN)
What are Graph Isomorphism Networks?
This article provides a brief overview of Graph Isomorphism Networks (GIN), complete with code examples in PyTorch Geometric and interactive visualizations using W&B.
Graph Neural Networks (GNNs) with Learnable Structural and Positional Representations
An in-depth breakdown of "Graph Neural Networks with Learnable Structural and Positional Representations" by Vijay Prakash Dwivedi, Anh Tuan Luu, Thomas Laurent, Yoshua Bengio and Xavier Bresson.
An Introduction to Message Passing Graph Neural Networks
This article provides a beginner-friendly introduction to Message Passing Graph Neural Networks (MPGNNs), which apply deep learning paradigms to graphical data.
Summary
In this article we attempted to provide a simple notation for graphs and introduce key methods in Graph Representation Learning. We also looked at some important concepts such as Permutation Invariance and Equivariance.
If you want more reports covering graph neural networks with code implementations, let us know in the comments below or on our community discord ✨!
Check out these other reports on Fully Connected covering other Graph Neural Network-based topics and ideas.
Further Reading
A Brief Introduction to Graph Contrastive Learning
This article provides an overview of "Deep Graph Contrastive Representation Learning" and introduces a general formulation for Contrastive Representation Learning on Graphs using W&B for interactive visualizations. It includes code samples for you to follow!
GraphCL: Graph Contrastive Learning Framework with Augmentations
Graph Contrastive Learning Framework as outlined in "Graph Contrastive Learning with Augmentations" by You et al.
Multi-view Graph Representation Learning
Easy to digest breakdown of "Contrastive Multi-View Representation Learning on Graphs" by Kaveh Hassani and Amir Hosein Khasahmadi
Multi-Task Self Supervised Graph Representation Learning
Brief breakdown of Multi-task Self-supervised Graph Neural Network Enable Stronger Task Generalization [ICLR 2023] by Mingxuan Ju, Tong Zhao, Qianlong Wen, Wenhao Yu, Neil Shah, Yanfang Ye and Chuxu Zhang