torch_geometric.nn.conv.AntiSymmetricConv

class AntiSymmetricConv(in_channels: int, phi: Optional[MessagePassing] = None, num_iters: int = 1, epsilon: float = 0.1, gamma: float = 0.1, act: Optional[Union[str, Callable]] = 'tanh', act_kwargs: Optional[Dict[str, Any]] = None, bias: bool = True)[source]

Bases: Module

The anti-symmetric graph convolutional operator from the "Anti-Symmetric DGN: a stable architecture for Deep Graph Networks" paper.

\[\mathbf{x}^{\prime}_i = \mathbf{x}_i + \epsilon \cdot \sigma \left( (\mathbf{W}-\mathbf{W}^T-\gamma \mathbf{I}) \mathbf{x}_i + \Phi(\mathbf{X}, \mathcal{N}_i) + \mathbf{b}\right),\]

where \(\Phi(\mathbf{X}, \mathcal{N}_i)\) denotes a MessagePassing layer.
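To make the update rule concrete, the following is a minimal sketch of a single update step in plain PyTorch. The tensors `x`, `phi_out`, `W`, and `b` are hypothetical stand-ins for the layer's internals, and `phi_out` plays the role of \(\Phi(\mathbf{X}, \mathcal{N}_i)\):

```python
import torch

num_nodes, channels = 4, 8
x = torch.randn(num_nodes, channels)        # node features X
phi_out = torch.randn(num_nodes, channels)  # stand-in for Phi(X, N_i), e.g. a GCNConv output
W = torch.randn(channels, channels)         # learnable weight
b = torch.zeros(channels)                   # bias
epsilon, gamma = 0.1, 0.1

# Anti-symmetric weight: W - W^T - gamma * I
W_anti = W - W.t() - gamma * torch.eye(channels)

# x_i' = x_i + epsilon * sigma((W - W^T - gamma*I) x_i + Phi(X, N_i) + b)
x_new = x + epsilon * torch.tanh(x @ W_anti.t() + phi_out + b)
```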

Parameters:
  • in_channels (int) – Size of each input sample.

  • phi (MessagePassing, optional) – The message passing module \(\Phi\). If set to None, a GCNConv layer will be used as default. (default: None)

  • num_iters (int, optional) – The number of times the anti-symmetric deep graph network operator is called. (default: 1)

  • epsilon (float, optional) – The discretization step size \(\epsilon\). (default: 0.1)

  • gamma (float, optional) – The strength of the diffusion \(\gamma\). It regulates the stability of the method. (default: 0.1)

  • act (str, optional) – The non-linear activation function \(\sigma\), e.g., "tanh" or "relu". (default: "tanh")

  • act_kwargs (Dict[str, Any], optional) – Arguments passed to the respective activation function defined by act. (default: None)

  • bias (bool, optional) – If set to False, the layer will not learn an additive bias. (default: True)

Shapes:
  • input: node features \((|\mathcal{V}|, F_{in})\), edge indices \((2, |\mathcal{E}|)\), edge weights \((|\mathcal{E}|)\) (optional)

  • output: node features \((|\mathcal{V}|, F_{in})\)

forward(x: Tensor, edge_index: Union[Tensor, SparseTensor], *args, **kwargs) Tensor[source]

Runs the forward pass of the module.

Return type:

Tensor
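
A minimal usage sketch, assuming a hypothetical toy graph with random node features; the tensor shapes follow the Shapes section above:

```python
import torch
from torch_geometric.nn import AntiSymmetricConv

# Toy graph: 3 nodes with 16 features each, 4 directed edges.
x = torch.randn(3, 16)
edge_index = torch.tensor([[0, 1, 1, 2],
                           [1, 0, 2, 1]])

conv = AntiSymmetricConv(in_channels=16, num_iters=2)
out = conv(x, edge_index)  # shape: [3, 16], same as the input
```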

reset_parameters()[source]

Resets all learnable parameters of the module.