PPR
class dgl.transforms.PPR(alpha=0.15, eweight_name='w', eps=None, avg_degree=5)

Bases: dgl.transforms.module.BaseTransform
Apply personalized PageRank (PPR) to an input graph for diffusion, as introduced in The PageRank Citation Ranking: Bringing Order to the Web.
Sparsification is applied to the weighted adjacency matrix after diffusion: edges whose weight falls below a threshold are dropped.
This module only works for homogeneous graphs.
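Conceptually, the transform computes a dense personalized PageRank matrix from the graph and then thresholds it. Below is a minimal dense-tensor sketch of this idea, assuming a row-normalized transition matrix and the closed form \(S = \alpha (I - (1 - \alpha) T)^{-1}\); these conventions are chosen for illustration and are not read from DGL's implementation.

import torch

def ppr_dense(adj, alpha=0.15, eps=0.05):
    """Dense PPR diffusion followed by threshold sparsification (illustrative)."""
    n = adj.shape[0]
    deg = adj.sum(dim=1, keepdim=True).clamp(min=1.0)  # out-degrees; guard against isolated nodes
    trans = adj / deg                                  # row-normalized transition matrix (assumed convention)
    diff = alpha * torch.linalg.inv(torch.eye(n) - (1.0 - alpha) * trans)
    return diff * (diff >= eps)                        # drop entries below the threshold

# Same edges as the doctest in the Example section, as a dense adjacency matrix.
adj = torch.zeros(6, 6)
adj[[0, 1, 2, 3, 4], [2, 3, 4, 5, 3]] = 1.0
print(ppr_dense(adj))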
Parameters
alpha (float, optional) – Restart probability, which commonly lies in \([0.05, 0.2]\).

eweight_name (str, optional) – edata name to retrieve and store edge weights. If it does not exist in an input graph, this module initializes a weight of 1 for all edges. The edge weights should be a tensor of shape \((E)\), where \(E\) is the number of edges.

eps (float, optional) – The threshold for preserving edges in sparsification after diffusion. Edges with a weight smaller than eps will be dropped.

avg_degree (int, optional) – The desired average node degree of the result graph. This is an alternative way to control the sparsity of the result graph and takes effect only if eps is not given.
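To make the avg_degree mechanism concrete, one plausible mapping from a degree budget to a threshold is to keep roughly avg_degree * N of the largest diffusion weights. The helper below is a hypothetical sketch of that rule; the function name and the exact selection rule are assumptions, not DGL's code.

import torch

def threshold_from_avg_degree(diff, avg_degree=5):
    # Keep roughly avg_degree * N of the largest diffusion weights:
    # the (avg_degree * N)-th largest weight becomes the cutoff.
    n = diff.shape[0]
    k = min(avg_degree * n, diff.numel())
    return torch.topk(diff.flatten(), k).values[-1]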
Example
>>> import dgl
>>> import torch
>>> from dgl import PPR

>>> transform = PPR(avg_degree=2)
>>> g = dgl.graph(([0, 1, 2, 3, 4], [2, 3, 4, 5, 3]))
>>> g.edata['w'] = torch.tensor([0.1, 0.2, 0.3, 0.4, 0.5])
>>> new_g = transform(g)
>>> print(new_g.edata['w'])
tensor([0.1500, 0.1500, 0.1500, 0.0255, 0.0163, 0.1500, 0.0638, 0.0383, 0.1500,
        0.0510, 0.0217, 0.1500])
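Sparsity can also be controlled directly with eps, in which case avg_degree is ignored. For instance (the threshold value 0.05 here is an arbitrary choice for illustration):

>>> transform = PPR(eps=0.05)
>>> new_g = transform(g)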