APPNPConv

class dgl.nn.mxnet.conv.APPNPConv(k, alpha, edge_drop=0.0)[source]

Bases: mxnet.gluon.block.Block

Approximate Personalized Propagation of Neural Predictions layer from "Predict then Propagate: Graph Neural Networks meet Personalized PageRank"

\[
\begin{aligned}
H^{0} &= X\\
H^{l+1} &= (1-\alpha)\left(\tilde{D}^{-1/2} \tilde{A} \tilde{D}^{-1/2} H^{l}\right) + \alpha H^{0}
\end{aligned}
\]

where \(\tilde{A} = A + I\) is the adjacency matrix with self-loops added, and \(\tilde{D}\) is its diagonal degree matrix.
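
The update rule can be sketched directly with dense NDArray operations (an illustrative re-derivation of the formula above, not the layer's implementation; the 2-node graph and all variable names are made up for the illustration):

>>> import mxnet as mx
>>> A = mx.nd.array([[0., 1.], [1., 0.]])           # adjacency of a toy 2-node graph
>>> A_tilde = A + mx.nd.eye(2)                      # A + I (add self-loops)
>>> d = A_tilde.sum(axis=1)                         # degrees of the self-loop graph
>>> D_inv_sqrt = mx.nd.diag(d ** -0.5)              # D^{-1/2}
>>> A_hat = mx.nd.dot(mx.nd.dot(D_inv_sqrt, A_tilde), D_inv_sqrt)
>>> H0 = mx.nd.ones((2, 3))                         # H^{0} = X
>>> H, alpha = H0, 0.5
>>> for _ in range(3):                              # K = 3 propagation steps
...     H = (1 - alpha) * mx.nd.dot(A_hat, H) + alpha * H0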

Parameters
  • k (int) – The number of iterations \(K\).

  • alpha (float) – The teleport probability \(\alpha\).

  • edge_drop (float, optional) – The dropout rate on edges that controls the messages received by each node. Default: 0.

Example

>>> import dgl
>>> import numpy as np
>>> import mxnet as mx
>>> from dgl.nn import APPNPConv
>>>
>>> g = dgl.graph(([0,1,2,3,2,5], [1,2,3,4,0,3]))
>>> feat = mx.nd.ones((6, 10))
>>> conv = APPNPConv(k=3, alpha=0.5)
>>> conv.initialize(ctx=mx.cpu(0))
>>> res = conv(g, feat)
>>> res
[[1.         1.         1.         1.         1.         1.
1.         1.         1.         1.        ]
[1.         1.         1.         1.         1.         1.
1.         1.         1.         1.        ]
[1.         1.         1.         1.         1.         1.
1.         1.         1.         1.        ]
[1.0303301  1.0303301  1.0303301  1.0303301  1.0303301  1.0303301
1.0303301  1.0303301  1.0303301  1.0303301 ]
[0.86427665 0.86427665 0.86427665 0.86427665 0.86427665 0.86427665
0.86427665 0.86427665 0.86427665 0.86427665]
[0.5        0.5        0.5        0.5        0.5        0.5
0.5        0.5        0.5        0.5       ]]
<NDArray 6x10 @cpu(0)>

forward(graph, feat)[source]

Compute the APPNP layer.

Parameters
  • graph (DGLGraph) – The graph.

  • feat (mx.NDArray) – The input feature of shape \((N, *)\), where \(N\) is the number of nodes and \(*\) can be any number of additional dimensions.

Returns

The output feature of shape \((N, *)\), where \(*\) is the same as the input feature's trailing shape.

Return type

mx.NDArray
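
A minimal usage sketch of forward with a feature tensor that has extra trailing dimensions (the graph, shapes, and values below are illustrative; per the shape contract above, the output should keep the input shape):

>>> g = dgl.graph(([0, 1, 2], [1, 2, 0]))
>>> feat = mx.nd.ones((3, 4, 5))                    # (N, *) with N = 3 and * = (4, 5)
>>> conv = APPNPConv(k=2, alpha=0.1)
>>> conv.initialize(ctx=mx.cpu(0))
>>> out = conv(g, feat)
>>> out.shape
(3, 4, 5)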