torch_geometric.nn.models.MetaPath2Vec

class MetaPath2Vec(edge_index_dict: Dict[Tuple[str, str, str], Tensor], embedding_dim: int, metapath: List[Tuple[str, str, str]], walk_length: int, context_size: int, walks_per_node: int = 1, num_negative_samples: int = 1, num_nodes_dict: Optional[Dict[str, int]] = None, sparse: bool = False)[source]

Bases: Module

The MetaPath2Vec model from the "metapath2vec: Scalable Representation Learning for Heterogeneous Networks" paper, in which random walks based on a given metapath are sampled in a heterogeneous graph, and node embeddings are learned via negative sampling optimization.

Note

For an example of using MetaPath2Vec, see examples/hetero/metapath2vec.py.

Parameters:
  • edge_index_dict (Dict[Tuple[str, str, str], torch.Tensor]) – Dictionary holding edge indices for each (src_node_type, rel_type, dst_node_type) edge type present in the heterogeneous graph.

  • embedding_dim (int) – The size of each embedding vector.

  • metapath (List[Tuple[str, str, str]]) – The metapath described as a list of (src_node_type, rel_type, dst_node_type) tuples.

  • walk_length (int) – The walk length.

  • context_size (int) – The actual context size which is considered for positive samples. This parameter increases the effective sampling rate by reusing samples across different source nodes.

  • walks_per_node (int, optional) – The number of walks to sample for each node. (default: 1)

  • num_negative_samples (int, optional) – The number of negative samples to use for each positive sample. (default: 1)

  • num_nodes_dict (Dict[str, int], optional) – Dictionary holding the number of nodes for each node type. (default: None)

  • sparse (bool, optional) – If set to True, gradients w.r.t. the weight matrix will be sparse. (default: False)
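To illustrate the sampling step the model performs internally, here is a minimal pure-Python sketch of a metapath-guided random walk. The adjacency dictionary, node names, and the `metapath_walk` helper are hypothetical stand-ins for the tensor-based sampling that torch_geometric actually implements:

```python
import random

def metapath_walk(adj, metapath, start, walk_length):
    """Sample one metapath-guided walk. `adj` maps an edge type
    (src_type, rel_type, dst_type) to a {src_node: [dst_nodes]} dict.
    The metapath is cycled until `walk_length` steps are taken."""
    walk = [start]
    node = start
    for i in range(walk_length):
        edge_type = metapath[i % len(metapath)]
        neighbors = adj[edge_type].get(node, [])
        if not neighbors:  # dead end: stop the walk early
            break
        node = random.choice(neighbors)
        walk.append(node)
    return walk

# Hypothetical toy graph: authors write papers, papers are written by authors.
adj = {
    ('author', 'writes', 'paper'): {'a0': ['p0'], 'a1': ['p0', 'p1']},
    ('paper', 'written_by', 'author'): {'p0': ['a0', 'a1'], 'p1': ['a1']},
}
metapath = [('author', 'writes', 'paper'),
            ('paper', 'written_by', 'author')]
walk = metapath_walk(adj, metapath, 'a0', walk_length=4)
```

Note how the walk alternates node types according to the metapath, which is the property that makes the resulting skip-gram training heterogeneity-aware.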

forward(node_type: str, batch: Optional[Tensor] = None) Tensor[source]

Returns the embeddings for the nodes in batch of type node_type.

Return type:

Tensor

reset_parameters()[source]

Resets all learnable parameters of the module.

loader(**kwargs)[source]

Returns the data loader that creates both positive and negative random walks on the heterogeneous graph.

Parameters:

**kwargs (optional) – Arguments of torch.utils.data.DataLoader, such as batch_size, shuffle, drop_last or num_workers.
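The loader turns each sampled walk into overlapping windows of context_size nodes, which become the positive samples. A minimal sketch of that windowing, assuming a plain list of node indices rather than the tensor batches the real loader yields (the `walk_to_contexts` helper is hypothetical):

```python
def walk_to_contexts(walk, context_size):
    """Split a walk into overlapping windows of `context_size` nodes;
    each window pairs its first node with the nodes that follow it."""
    return [walk[i:i + context_size]
            for i in range(len(walk) - context_size + 1)]

# A walk of length 5 with context_size=3 yields 3 positive windows.
windows = walk_to_contexts([0, 5, 2, 7, 3], context_size=3)
```

Reusing each node in several windows is what the context_size documentation means by increasing the effective sampling rate.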

loss(pos_rw: Tensor, neg_rw: Tensor) Tensor[source]

Computes the loss given positive and negative random walks.

Return type:

Tensor
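The loss is the standard skip-gram objective with negative sampling: minimize -log σ(score) for positive pairs and -log σ(-score) for negative ones, where each score is a dot product of embeddings. A scalar sketch in plain Python (the score lists stand in for the batched tensor operations the module performs):

```python
import math

def negative_sampling_loss(pos_scores, neg_scores):
    """Skip-gram negative-sampling loss: -log(sigmoid(s)) over positive
    pair scores, -log(sigmoid(-s)) over negative pair scores, averaged."""
    sigmoid = lambda x: 1.0 / (1.0 + math.exp(-x))
    pos = [-math.log(sigmoid(s)) for s in pos_scores]
    neg = [-math.log(sigmoid(-s)) for s in neg_scores]
    return (sum(pos) + sum(neg)) / (len(pos) + len(neg))

# High positive scores and low negative scores give a small loss.
loss = negative_sampling_loss([2.0, 1.5], [-1.0, -0.5])
```

The loss shrinks as positive pairs score higher and negative pairs score lower, pushing co-occurring nodes together in embedding space.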

test(train_z: Tensor, train_y: Tensor, test_z: Tensor, test_y: Tensor, solver: str = 'lbfgs', *args, **kwargs) float[source]

Evaluates latent space quality via a logistic regression downstream task.

Return type:

float
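A sketch of what this evaluation amounts to, assuming scikit-learn is available and substituting random, well-separated toy arrays for real train_z/test_z embeddings:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# Hypothetical "embeddings": two well-separated clusters, one per class.
train_z = np.vstack([rng.normal(0, 0.1, (20, 8)),
                     rng.normal(3, 0.1, (20, 8))])
train_y = np.array([0] * 20 + [1] * 20)
test_z = np.vstack([rng.normal(0, 0.1, (5, 8)),
                    rng.normal(3, 0.1, (5, 8))])
test_y = np.array([0] * 5 + [1] * 5)

# Fit a logistic regression on the training embeddings and score it on
# held-out ones, mirroring test(train_z, train_y, test_z, test_y).
clf = LogisticRegression(solver='lbfgs').fit(train_z, train_y)
acc = clf.score(test_z, test_y)
```

The returned float is the classifier's test accuracy: good embeddings make node classes linearly separable, so higher is better.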