torch_geometric.nn.models.AttentiveFP

class AttentiveFP(in_channels: int, hidden_channels: int, out_channels: int, edge_dim: int, num_layers: int, num_timesteps: int, dropout: float = 0.0)[source]

Bases: Module

The Attentive FP model for molecular representation learning from the “Pushing the Boundaries of Molecular Representation for Drug Discovery with the Graph Attention Mechanism” paper, based on graph attention mechanisms.

Parameters:
  • in_channels (int) – Size of each input sample.

  • hidden_channels (int) – Hidden node feature dimensionality.

  • out_channels (int) – Size of each output sample.

  • edge_dim (int) – Edge feature dimensionality.

  • num_layers (int) – Number of GNN layers.

  • num_timesteps (int) – Number of iterative refinement steps for the global readout.

  • dropout (float, optional) – Dropout probability. (default: 0.0)
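A minimal instantiation sketch; the feature dimensions and hyperparameters below are illustrative assumptions, not values prescribed by the paper:

    from torch_geometric.nn.models import AttentiveFP

    # Hypothetical sizes: 16-dimensional atom features, 8-dimensional bond features,
    # one regression target, two GNN layers, and two readout timesteps.
    model = AttentiveFP(in_channels=16, hidden_channels=64, out_channels=1,
                        edge_dim=8, num_layers=2, num_timesteps=2, dropout=0.2)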

forward(x: Tensor, edge_index: Tensor, edge_attr: Tensor, batch: Tensor) → Tensor[source]
Return type:

Tensor
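A sketch of a full forward pass on a toy graph, reusing the assumed dimensions from the instantiation example above; the returned tensor has shape [num_graphs, out_channels]:

    import torch
    from torch_geometric.nn.models import AttentiveFP

    # Same hypothetical model as above (16-d atom features, 8-d bond features).
    model = AttentiveFP(in_channels=16, hidden_channels=64, out_channels=1,
                        edge_dim=8, num_layers=2, num_timesteps=2)

    # Toy molecular graph: 4 atoms, 3 directed bonds (all values are illustrative).
    x = torch.randn(4, 16)                     # node features: [num_nodes, in_channels]
    edge_index = torch.tensor([[0, 1, 2],
                               [1, 2, 3]])     # connectivity: [2, num_edges]
    edge_attr = torch.randn(3, 8)              # bond features: [num_edges, edge_dim]
    batch = torch.zeros(4, dtype=torch.long)   # all four atoms belong to graph 0

    out = model(x, edge_index, edge_attr, batch)
    print(out.shape)                           # torch.Size([1, 1]), i.e. [num_graphs, out_channels]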

reset_parameters()[source]

Resets all learnable parameters of the module.