
Introduction to Chebyshev Graph Convolutional Neural Networks

This article provides a brief overview of Chebyshev Graph Convolutional Neural Networks (ChebGCN), which solve the orthogonality problem of LapGCNs, complete with code examples in PyTorch Geometric and interactive visualizations using W&B.
TL;DR: Graph Spectral Convolutions with Chebyshev Polynomials
In this article, we'll go over the Chebyshev Graph Convolutional Neural Network (ChebGCN) architecture, an important spectral graph convolution method by Michaël Defferrard, Xavier Bresson, and Pierre Vandergheynst. Inspired by spectral graph theory, the authors provide efficient numerical schemes to design fast, localized convolutional filters on graphs.
As opposed to spatial graph neural networks, these networks rely on formulations derived from formal graph theory, in which similar features are identified with localized convolutional filters or kernels that are independent of their spatial locations.
Method

But why spectral convolutions? What do they offer that spatial methods don't? Spatial convolutions on graphs provide filter localization via the finite size of the kernel, but they face the challenge of matching local neighbourhoods. More importantly, there is no unique mathematical definition of translation on graphs from a spatial perspective 😅, which limits their effectiveness.
On the other hand, a spectral approach provides a well-defined localization operator on graphs via convolutions with a Kronecker delta implemented in the spectral domain: essentially a function of two arguments that equals 1 when they are equal and 0 otherwise.
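In symbols, the delta centered at vertex $i$ is

$$\delta_i(j) = \begin{cases} 1 & \text{if } j = i \\ 0 & \text{otherwise,} \end{cases}$$

and convolving a spectral filter with $\delta_i$ yields that filter localized around vertex $i$.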
Most graph spectral convolutions define a filter in the Fourier basis of the graph (the eigenbasis of the graph Laplacian), which results in a filter that is not naturally localized; translations are also costly due to the $\mathcal{O}(n^2)$ multiplication with the graph Fourier basis. The authors overcome these limitations through a special choice of filter parametrization.
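Concretely, with the normalized graph Laplacian $L = I_n - D^{-1/2} A D^{-1/2} = U \Lambda U^\top$ (where $U$ holds the Fourier basis, i.e. the eigenvectors, and $\Lambda$ the eigenvalues), filtering a signal $x \in \mathbb{R}^n$ with a spectral filter $g_\theta$ reads

$$y = g_\theta(L)\,x = U\,g_\theta(\Lambda)\,U^\top x,$$

and the dense multiplications with $U$ and $U^\top$ are where the $\mathcal{O}(n^2)$ cost comes from.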
We cannot express a meaningful translation operator directly in the vertex domain, which is why we resort to the Fourier basis of the graph. Historically, these spectral filters were non-parametric, i.e. one free coefficient per eigenvalue. Such filters are not localized in space, and their learning complexity is $\mathcal{O}(n)$, the dimensionality of the data. These issues can be overcome by parametrizing the filter with Chebyshev polynomials, whose recursive definition makes a filter with $K$ coefficients exactly $K$-localized, brings the learning complexity down to $\mathcal{O}(K)$, and avoids computing the Fourier basis altogether.
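In the paper's parametrization, the filter is a truncated expansion in Chebyshev polynomials of the rescaled Laplacian $\tilde{L} = 2L/\lambda_{\max} - I_n$:

$$g_\theta(\tilde{L}) = \sum_{k=0}^{K-1} \theta_k\,T_k(\tilde{L}), \qquad T_k(x) = 2x\,T_{k-1}(x) - T_{k-2}(x), \quad T_0(x) = 1, \; T_1(x) = x.$$

Thanks to the recurrence, filtering reduces to $K$ sparse matrix-vector products with $\tilde{L}$, costing $\mathcal{O}(K|E|)$ instead of $\mathcal{O}(n^2)$.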
Also noteworthy is their graph pooling method, defined in Section 2.3 of the paper.

Code

PyTorch Geometric provides an implementation of the Chebyshev spectral graph convolutional operator outlined in the paper (ChebConv).
Let's walk through an example implementation of a graph convolutional network using this layer.
import torch
import torch.nn.functional as F
from torch_geometric.nn import ChebConv


class ChebNet(torch.nn.Module):
    def __init__(self, num_features, latent_dim, num_classes, num_hops):
        super().__init__()
        # Two Chebyshev convolutions; num_hops is the Chebyshev filter
        # size K, so each layer aggregates a K-hop neighbourhood.
        self.conv1 = ChebConv(num_features, latent_dim, num_hops)
        self.conv2 = ChebConv(latent_dim, num_classes, num_hops)

    def reset_parameters(self):
        self.conv1.reset_parameters()
        self.conv2.reset_parameters()

    def forward(self, data):
        x, edge_index = data.x, data.edge_index
        x = F.relu(self.conv1(x, edge_index))
        x = F.dropout(x, p=0.5, training=self.training)
        x = self.conv2(x, edge_index)
        return F.log_softmax(x, dim=1)
NOTE: If you've read through other GCN implementations from our GCN series, you'll notice that this snippet looks the same apart from the convolution layer. That's the great thing about PyTorch Geometric: it provides plug-and-play layers that you can swap out for the layer you need.

Result

Let's compare training the aforementioned ChebNet architecture with varying latent dimensions on the Cora dataset.
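For reference, here is a minimal sketch of such a training run, assuming the ChebNet class defined above; the hyperparameters (latent_dim=16, num_hops=2, learning rate 0.01, 200 epochs) are illustrative assumptions, not necessarily the settings behind the panel below.

import torch
import torch.nn.functional as F
from torch_geometric.datasets import Planetoid

# Load the Cora citation dataset (a single graph).
dataset = Planetoid(root='data/Cora', name='Cora')
data = dataset[0]

# Illustrative hyperparameters; vary latent_dim to reproduce the comparison.
model = ChebNet(dataset.num_features, latent_dim=16,
                num_classes=dataset.num_classes, num_hops=2)
optimizer = torch.optim.Adam(model.parameters(), lr=0.01, weight_decay=5e-4)

model.train()
for epoch in range(200):
    optimizer.zero_grad()
    out = model(data)
    # Negative log-likelihood on the training nodes only.
    loss = F.nll_loss(out[data.train_mask], data.y[data.train_mask])
    loss.backward()
    optimizer.step()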

[W&B panel: run set of 3 training runs comparing latent dimensions]


Summary

In this article, we learned about Chebyshev Graph Convolutional Neural Networks (ChebGCN), a popular spectral graph convolution method, along with code and interactive visualizations.
If you want more reports covering graph neural networks with code implementations, let us know in the comments below or on our forum ✨!
Check out these other reports on Fully Connected covering other Graph Neural Network-based topics and ideas.
To see the full suite of W&B features, please check out this short 5-minute guide.