dgl.nn (PyTorch)

Conv Layers

GraphConv

Graph convolutional layer from Semi-Supervised Classification with Graph Convolutional Networks
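A minimal usage sketch (assuming the typical dgl.nn.GraphConv signature; self-loops are added because the layer rejects zero-in-degree nodes by default):

    import torch
    import dgl
    from dgl.nn import GraphConv

    g = dgl.add_self_loop(dgl.graph(([0, 1, 2], [1, 2, 3])))  # 4-node graph with self-loops
    feat = torch.randn(4, 10)        # 4 nodes, 10 input features
    conv = GraphConv(10, 16)         # in_feats=10, out_feats=16
    out = conv(g, feat)              # shape: (4, 16)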

EdgeWeightNorm

This module normalizes positive scalar edge weights on a graph following the form in GCN.
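A hedged sketch of combining EdgeWeightNorm with GraphConv (the norm='none' pass-through on GraphConv and the edge_weight argument are assumed from the usual DGL pattern; check your installed version):

    import torch
    import dgl
    from dgl.nn import EdgeWeightNorm, GraphConv

    g = dgl.graph(([0, 1, 2, 3], [1, 2, 3, 0]))       # a 4-node cycle, no zero-in-degree nodes
    feat = torch.ones(4, 10)
    edge_weight = torch.tensor([0.5, 0.6, 0.4, 0.7])

    norm = EdgeWeightNorm(norm='both')                # symmetric GCN-style normalization
    norm_weight = norm(g, edge_weight)                # one normalized scalar per edge
    conv = GraphConv(10, 2, norm='none')              # weights are already normalized
    out = conv(g, feat, edge_weight=norm_weight)      # shape: (4, 2)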

RelGraphConv

Relational graph convolution layer from Modeling Relational Data with Graph Convolutional Networks
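A hedged RelGraphConv sketch, assuming the usual DGL signature in which each edge carries an integer relation id:

    import torch
    import dgl
    from dgl.nn import RelGraphConv

    g = dgl.graph(([0, 1, 2, 3], [1, 2, 3, 0]))
    etypes = torch.tensor([0, 1, 0, 1])               # relation id per edge
    feat = torch.randn(4, 10)
    conv = RelGraphConv(10, 16, num_rels=2, regularizer='basis', num_bases=2)
    out = conv(g, feat, etypes)                       # shape: (4, 16)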

TAGConv

Topology Adaptive Graph Convolutional layer from Topology Adaptive Graph Convolutional Networks

GATConv

Graph attention layer from Graph Attention Network
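A minimal GATConv sketch (shapes assumed from the typical DGL API): the output carries one feature vector per attention head, which is usually flattened or averaged:

    import torch
    import dgl
    from dgl.nn import GATConv

    g = dgl.add_self_loop(dgl.graph(([0, 1, 2], [1, 2, 0])))
    feat = torch.randn(3, 8)
    gat = GATConv(in_feats=8, out_feats=4, num_heads=3)
    out = gat(g, feat)                                # shape: (3, 3, 4) = (nodes, heads, out_feats)
    out = out.flatten(1)                              # concatenate heads: (3, 12)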

GATv2Conv

GATv2 from How Attentive are Graph Attention Networks?

EGATConv

Graph attention layer that handles edge features, from Rossmann-Toolbox (see supplementary data)

EdgeGATConv

Graph attention layer with edge features from SCENE

EdgeConv

EdgeConv layer from Dynamic Graph CNN for Learning on Point Clouds

SAGEConv

GraphSAGE layer from Inductive Representation Learning on Large Graphs
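A hedged SAGEConv sketch; aggregator_type is assumed to accept 'mean', 'gcn', 'pool', or 'lstm' as in the usual DGL API:

    import torch
    import dgl
    from dgl.nn import SAGEConv

    g = dgl.graph(([0, 1, 2, 3], [1, 2, 3, 0]))
    feat = torch.randn(4, 16)
    sage = SAGEConv(in_feats=16, out_feats=32, aggregator_type='mean')
    out = sage(g, feat)                               # shape: (4, 32)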

SGConv

SGC layer from Simplifying Graph Convolutional Networks

APPNPConv

Approximate Personalized Propagation of Neural Predictions layer from Predict then Propagate: Graph Neural Networks meet Personalized PageRank

GINConv

Graph Isomorphism Network layer from How Powerful are Graph Neural Networks?
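A hedged GINConv sketch: the layer wraps a user-supplied MLP (the apply_func and aggregator_type arguments are assumed from the usual DGL signature):

    import torch
    import torch.nn as nn
    import dgl
    from dgl.nn import GINConv

    g = dgl.graph(([0, 1, 2, 3], [1, 2, 3, 0]))
    feat = torch.randn(4, 8)
    mlp = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 16))
    gin = GINConv(apply_func=mlp, aggregator_type='sum')
    out = gin(g, feat)                                # shape: (4, 16)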

GINEConv

Graph Isomorphism Network with Edge Features, introduced in Strategies for Pre-training Graph Neural Networks

GatedGraphConv

Gated Graph Convolution layer from Gated Graph Sequence Neural Networks

GatedGCNConv

Gated graph convolutional layer from Benchmarking Graph Neural Networks

GMMConv

Gaussian Mixture Model Convolution layer from Geometric Deep Learning on Graphs and Manifolds using Mixture Model CNNs

ChebConv

Chebyshev Spectral Graph Convolution layer from Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering

AGNNConv

Attention-based Graph Neural Network layer from Attention-based Graph Neural Network for Semi-Supervised Learning

NNConv

Graph Convolution layer from Neural Message Passing for Quantum Chemistry

AtomicConv

Atomic Convolution Layer from Atomic Convolutional Networks for Predicting Protein-Ligand Binding Affinity

CFConv

CFConv from SchNet: A continuous-filter convolutional neural network for modeling quantum interactions

DotGatConv

Apply the dot-product version of self-attention from Graph Attention Network

TWIRLSConv

Convolution combined with iteratively reweighted least squares (IRLS), from Graph Neural Networks Inspired by Classical Iterative Algorithms

TWIRLSUnfoldingAndAttention

Combine propagation and attention in a single module.

GCN2Conv

Graph Convolutional Network via Initial residual and Identity mapping (GCNII) from Simple and Deep Graph Convolutional Networks

HGTConv

Heterogeneous graph transformer convolution from Heterogeneous Graph Transformer

GroupRevRes

Grouped reversible residual connections for GNNs, as introduced in Training Graph Neural Networks with 1000 Layers

EGNNConv

Equivariant Graph Convolutional Layer from E(n) Equivariant Graph Neural Networks

PNAConv

Principal Neighbourhood Aggregation Layer from Principal Neighbourhood Aggregation for Graph Nets

DGNConv

Directional Graph Network Layer from Directional Graph Networks

CuGraph Conv Layers

CuGraphRelGraphConv

An accelerated relational graph convolution layer from Modeling Relational Data with Graph Convolutional Networks that leverages the highly-optimized aggregation primitives in cugraph-ops.

CuGraphGATConv

Graph attention layer from Graph Attention Networks, with the sparse aggregation accelerated by cugraph-ops.

CuGraphSAGEConv

An accelerated GraphSAGE layer from Inductive Representation Learning on Large Graphs that leverages the highly-optimized aggregation primitives in cugraph-ops.

Dense Conv Layers

DenseGraphConv

Graph Convolutional layer from Semi-Supervised Classification with Graph Convolutional Networks

DenseSAGEConv

GraphSAGE layer from Inductive Representation Learning on Large Graphs

DenseChebConv

Chebyshev Spectral Graph Convolution layer from Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering

Global Pooling Layers

SumPooling

Apply sum pooling over the nodes in a graph.

AvgPooling

Apply average pooling over the nodes in a graph.

MaxPooling

Apply max pooling over the nodes in a graph.
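A hedged readout sketch for the three simple pooling modules above: each reduces the node features of every graph in a batch to a single vector per graph:

    import torch
    import dgl
    from dgl.nn import SumPooling, AvgPooling, MaxPooling

    g1 = dgl.graph(([0, 1], [1, 0]))
    g2 = dgl.graph(([0, 1, 2], [1, 2, 0]))
    bg = dgl.batch([g1, g2])                          # batched graph, 5 nodes total
    feat = torch.randn(5, 16)

    readout = SumPooling()                            # AvgPooling()/MaxPooling() work the same way
    hg = readout(bg, feat)                            # shape: (2, 16), one row per graph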

SortPooling

Sort Pooling from An End-to-End Deep Learning Architecture for Graph Classification

WeightAndSum

Compute importance weights for atoms and perform a weighted sum.

GlobalAttentionPooling

Global Attention Pooling from Gated Graph Sequence Neural Networks
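A hedged GlobalAttentionPooling sketch: a small gate network scores each node before the weighted sum (the gate_nn argument is assumed from the usual DGL signature):

    import torch
    import torch.nn as nn
    import dgl
    from dgl.nn import GlobalAttentionPooling

    g = dgl.graph(([0, 1, 2, 3], [1, 2, 3, 0]))
    feat = torch.randn(4, 16)
    gate_nn = nn.Linear(16, 1)                        # per-node scalar attention score
    pool = GlobalAttentionPooling(gate_nn)
    hg = pool(g, feat)                                # shape: (1, 16)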

Set2Set

Set2Set operator from Order Matters: Sequence to sequence for sets

SetTransformerEncoder

The Encoder module from Set Transformer: A Framework for Attention-based Permutation-Invariant Neural Networks

SetTransformerDecoder

The Decoder module from Set Transformer: A Framework for Attention-based Permutation-Invariant Neural Networks

Heterogeneous Learning Modules

HeteroGraphConv

A generic module for computing convolution on heterogeneous graphs.
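A hedged HeteroGraphConv sketch: one sub-module per relation, with inputs and outputs as dicts keyed by node type (the relation and node-type names below are made up for illustration):

    import torch
    import dgl
    from dgl.nn import HeteroGraphConv, GraphConv, SAGEConv

    g = dgl.heterograph({
        ('user', 'follows', 'user'): ([0, 1], [1, 2]),
        ('user', 'plays', 'game'): ([0, 2], [0, 1]),
    })
    conv = HeteroGraphConv({
        'follows': GraphConv(8, 16, allow_zero_in_degree=True),
        'plays': SAGEConv(8, 16, aggregator_type='mean'),
    }, aggregate='sum')

    inputs = {'user': torch.randn(3, 8), 'game': torch.randn(2, 8)}
    outputs = conv(g, inputs)                         # {'user': (3, 16), 'game': (2, 16)}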

HeteroLinear

Apply linear transformations on heterogeneous inputs.

HeteroEmbedding

Create a heterogeneous embedding table.

TypedLinear

Linear transformation whose parameters are selected according to the input types.

Utility Modules

Sequential

A sequential container for stacking graph neural network modules
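A hedged sketch of the container (assumed behavior: each stacked module is called as module(graph, feat), with the graph passed along unchanged):

    import torch
    import dgl
    from dgl.nn import Sequential, GraphConv

    g = dgl.add_self_loop(dgl.graph(([0, 1, 2], [1, 2, 0])))
    feat = torch.randn(3, 8)
    net = Sequential(GraphConv(8, 16, activation=torch.relu), GraphConv(16, 4))
    out = net(g, feat)                                # shape: (3, 4)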

WeightBasis

Basis decomposition from Modeling Relational Data with Graph Convolutional Networks

KNNGraph

Layer that transforms one point set into a graph, or a batch of point sets with the same number of points into a batched union of those graphs.
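A hedged KNNGraph sketch: build a k-nearest-neighbor graph over a point set:

    import torch
    from dgl.nn import KNNGraph

    points = torch.randn(32, 3)                       # 32 points in 3D
    kg = KNNGraph(k=4)
    g = kg(points)                                    # DGLGraph over the 32 points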

SegmentedKNNGraph

Layer that transforms one point set into a graph, or a batch of point sets with different numbers of points into a batched union of those graphs.

RadiusGraph

Layer that transforms one point set into a bidirected graph connecting neighbors within a given distance.

JumpingKnowledge

The Jumping Knowledge aggregation module from Representation Learning on Graphs with Jumping Knowledge Networks

NodeEmbedding

Class for storing node embeddings.

GNNExplainer

GNNExplainer model from GNNExplainer: Generating Explanations for Graph Neural Networks

HeteroGNNExplainer

GNNExplainer model from GNNExplainer: Generating Explanations for Graph Neural Networks, adapted for heterogeneous graphs

SubgraphX

SubgraphX from On Explainability of Graph Neural Networks via Subgraph Explorations (https://arxiv.org/abs/2102.05152)

HeteroSubgraphX

SubgraphX from On Explainability of Graph Neural Networks via Subgraph Explorations, adapted for heterogeneous graphs

PGExplainer

PGExplainer from Parameterized Explainer for Graph Neural Network (https://arxiv.org/pdf/2011.04573)

HeteroPGExplainer

PGExplainer from Parameterized Explainer for Graph Neural Network, adapted for heterogeneous graphs

LabelPropagation

Label Propagation from Learning from Labeled and Unlabeled Data with Label Propagation

Network Embedding Modules

DeepWalk

DeepWalk module from DeepWalk: Online Learning of Social Representations

MetaPath2Vec

metapath2vec module from metapath2vec: Scalable Representation Learning for Heterogeneous Networks

Utility Modules for Graph Transformer

DegreeEncoder

Degree Encoder, as introduced in Do Transformers Really Perform Bad for Graph Representation?

LapPosEncoder

Laplacian Positional Encoder (LPE), as introduced in GraphGPS: General Powerful Scalable Graph Transformers

PathEncoder

Path Encoder, as introduced in the edge encoding of Do Transformers Really Perform Bad for Graph Representation?

SpatialEncoder

Spatial Encoder, as introduced in Do Transformers Really Perform Bad for Graph Representation?

SpatialEncoder3d

3D Spatial Encoder, as introduced in One Transformer Can Understand Both 2D & 3D Molecular Data

BiasedMHA

Dense Multi-Head Attention Module with Graph Attention Bias.

GraphormerLayer

Graphormer Layer with Dense Multi-Head Attention, as introduced in Do Transformers Really Perform Bad for Graph Representation?

EGTLayer

Layer of the Edge-augmented Graph Transformer (EGT), as introduced in Global Self-Attention as a Replacement for Graph Convolution (https://arxiv.org/pdf/2108.03348.pdf)