EmbeddingBag
- class torch.ao.nn.quantized.EmbeddingBag(num_embeddings, embedding_dim, max_norm=None, norm_type=2.0, scale_grad_by_freq=False, mode='sum', sparse=False, _weight=None, include_last_offset=False, dtype=torch.quint8)[source]
A quantized EmbeddingBag module that takes quantized packed weights as inputs. We adopt the same interface as torch.nn.EmbeddingBag; see https://pytorch.org/docs/stable/nn.html#torch.nn.EmbeddingBag for documentation.
Similar to EmbeddingBag, attributes will be randomly initialized at module creation time and will be overwritten later.
- Variables
weight (Tensor) – the non-learnable quantized weights of the module, of shape (num_embeddings, embedding_dim).
- Examples::
>>> m = nn.quantized.EmbeddingBag(num_embeddings=10, embedding_dim=12, include_last_offset=True, mode='sum')
>>> indices = torch.tensor([9, 6, 5, 7, 8, 8, 9, 2, 8, 6, 6, 9, 1, 6, 8, 8, 3, 2, 3, 6, 3, 6, 5, 7, 0, 8, 4, 6, 5, 8, 2, 3])
>>> offsets = torch.tensor([0, 19, 20, 28, 28, 32])
>>> output = m(indices, offsets)
>>> print(output.size())
torch.Size([5, 12])
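In practice the quantized packed weight is usually obtained by quantizing a float torch.nn.EmbeddingBag rather than by constructing this module directly. The lines below are a minimal sketch of that eager-mode flow using float_qparams_weight_only_qconfig and the from_float classmethod; the module sizes, variable names, and tensor values are illustrative assumptions, not taken from the original docs.

>>> import torch
>>> from torch import nn
>>> import torch.ao.nn.quantized as nnq
>>> from torch.ao.quantization import float_qparams_weight_only_qconfig
>>> # Float module to be quantized; embedding quantization requires the weight-only qconfig
>>> float_mod = nn.EmbeddingBag(num_embeddings=10, embedding_dim=12, mode='sum')
>>> float_mod.qconfig = float_qparams_weight_only_qconfig
>>> # Observe and quantize the weight, then build the quantized module from it
>>> qmod = nnq.EmbeddingBag.from_float(float_mod)
>>> indices = torch.tensor([0, 2, 4, 6, 8, 9])
>>> offsets = torch.tensor([0, 3])
>>> qmod(indices, offsets).size()
torch.Size([2, 12])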