GINConv
class dgl.nn.mxnet.conv.GINConv(apply_func, aggregator_type, init_eps=0, learn_eps=False)[source]

Bases: mxnet.gluon.block.Block
Graph Isomorphism layer from How Powerful are Graph Neural Networks?
\[h_i^{(l+1)} = f_\Theta \left((1 + \epsilon) h_i^{(l)} + \mathrm{aggregate}\left(\left\{h_j^{(l)}, j\in\mathcal{N}(i) \right\}\right)\right)\]

Parameters
apply_func (callable activation function/layer or None) – If not None, apply this function to the updated node feature, the \(f_\Theta\) in the formula.

aggregator_type (str) – Aggregator type to use (sum, max or mean).

init_eps (float, optional) – Initial \(\epsilon\) value, default: 0.

learn_eps (bool, optional) – If True, \(\epsilon\) will be a learnable parameter. Default: False.
Example
>>> import dgl
>>> import numpy as np
>>> import mxnet as mx
>>> from mxnet import gluon
>>> from dgl.nn import GINConv
>>>
>>> g = dgl.graph(([0,1,2,3,2,5], [1,2,3,4,0,3]))
>>> feat = mx.nd.ones((6, 10))
>>> lin = gluon.nn.Dense(10)
>>> lin.initialize(ctx=mx.cpu(0))
>>> conv = GINConv(lin, 'max')
>>> conv.initialize(ctx=mx.cpu(0))
>>> res = conv(g, feat)
>>> res
[[ 0.44832918 -0.05283341  0.20823681  0.16020004  0.37311912 -0.03372726
  -0.05716725 -0.20730163  0.14121324  0.46083626]
 [ 0.44832918 -0.05283341  0.20823681  0.16020004  0.37311912 -0.03372726
  -0.05716725 -0.20730163  0.14121324  0.46083626]
 [ 0.44832918 -0.05283341  0.20823681  0.16020004  0.37311912 -0.03372726
  -0.05716725 -0.20730163  0.14121324  0.46083626]
 [ 0.44832918 -0.05283341  0.20823681  0.16020004  0.37311912 -0.03372726
  -0.05716725 -0.20730163  0.14121324  0.46083626]
 [ 0.44832918 -0.05283341  0.20823681  0.16020004  0.37311912 -0.03372726
  -0.05716725 -0.20730163  0.14121324  0.46083626]
 [ 0.22416459 -0.0264167   0.10411841  0.08010002  0.18655956 -0.01686363
  -0.02858362 -0.10365082  0.07060662  0.23041813]]
<NDArray 6x10 @cpu(0)>
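The update rule above can also be sketched in plain numpy, independent of DGL. The function and variable names below are illustrative only (not part of the DGL API), and the sketch assumes sum aggregation with a linear layer standing in for \(f_\Theta\):

```python
import numpy as np

def gin_update(edges, feat, weight, eps=0.0):
    """Illustrative GIN update: edges is a list of (src, dst) pairs,
    feat is an (N, D_in) feature matrix, weight plays the role of f_Theta."""
    agg = np.zeros_like(feat)
    for src, dst in edges:
        agg[dst] += feat[src]          # sum over in-neighbors of each node
    h = (1.0 + eps) * feat + agg       # (1 + eps) * h_i + aggregate(...)
    return h @ weight                  # apply f_Theta (here: a linear map)

# Tiny cycle graph: every node has exactly one in-neighbor.
edges = [(0, 1), (1, 2), (2, 0)]
feat = np.ones((3, 4))
weight = np.eye(4)
out = gin_update(edges, feat, weight)  # each entry: 1*1 + 1 = 2
```

With eps=0 and identity weights, each output row is simply the node's own feature plus the sum of its in-neighbors' features, matching the formula term by term.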
forward(graph, feat)[source]

Compute Graph Isomorphism Network layer.
Parameters

graph (DGLGraph) – The graph.

feat (mxnet.NDArray or a pair of mxnet.NDArray) – If a mxnet.NDArray is given, the input feature of shape \((N, D_{in})\), where \(D_{in}\) is the size of the input feature and \(N\) is the number of nodes. If a pair of mxnet.NDArray is given, the pair must contain two tensors of shape \((N_{in}, D_{in})\) and \((N_{out}, D_{in})\). If apply_func is not None, \(D_{in}\) should fit the input dimensionality requirement of apply_func.
Returns

The output feature of shape \((N, D_{out})\), where \(D_{out}\) is the output dimensionality of apply_func. If apply_func is None, \(D_{out}\) is the same as the input dimensionality.

Return type

mxnet.NDArray
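The pair-of-tensors form of feat can be sketched in numpy as well: the first tensor covers the source nodes and the second the destination nodes that the layer updates. The helper below is hypothetical (names are not part of the DGL API) and omits apply_func, i.e. it corresponds to apply_func=None:

```python
import numpy as np

def gin_update_pair(edges, feat_src, feat_dst, eps=0.0):
    """Illustrative pair-input GIN update: feat_src has shape (N_in, D_in),
    feat_dst has shape (N_out, D_in); only destination nodes get outputs."""
    agg = np.zeros_like(feat_dst)
    for src, dst in edges:
        agg[dst] += feat_src[src]       # sum source features into each dst
    return (1.0 + eps) * feat_dst + agg  # apply_func is None here

# 3 source nodes feeding 2 destination nodes.
edges = [(0, 0), (1, 0), (2, 1)]
feat_src = np.arange(6.0).reshape(3, 2)  # rows: [0,1], [2,3], [4,5]
feat_dst = np.ones((2, 2))
out = gin_update_pair(edges, feat_src, feat_dst)
```

Here the output has shape \((N_{out}, D_{in})\): destination node 0 aggregates source nodes 0 and 1, destination node 1 aggregates source node 2, and each adds \((1+\epsilon)\) times its own feature.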