ner.nn.embeddings.embedding module#

class ner.nn.embeddings.embedding.TokenEmbedding(vocab_size: int, embedding_dim: int, padding_idx: int = 0)#

Bases: Module

Initializes a lookup table of token embeddings (one per token in the vocabulary) using the given parameters.

Parameters:
vocab_size : int

The size of the vocabulary; a total of vocab_size embeddings are initialized using Embedding.

embedding_dim : int

The dimensionality of each token embedding.

padding_idx : int

The token index corresponding to padding tokens; the embedding at this index is a vector of all zeros.
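A minimal sketch of how this module could be implemented on top of the Embedding layer referenced above (presumably torch.nn.Embedding) is shown below. It reflects only the documented interface; the actual implementation in ner.nn.embeddings.embedding may differ.

import torch
from torch import nn

class TokenEmbedding(nn.Module):
    def __init__(self, vocab_size: int, embedding_dim: int, padding_idx: int = 0):
        super().__init__()
        # One embedding vector per vocabulary entry; the row at padding_idx
        # is initialized to zeros by nn.Embedding.
        self.embedding = nn.Embedding(vocab_size, embedding_dim, padding_idx=padding_idx)

    def forward(self, input_ids: torch.Tensor) -> torch.Tensor:
        # (batch_size, max_length) -> (batch_size, max_length, embedding_dim)
        return self.embedding(input_ids)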

forward(input_ids: Tensor) Tensor#

Computes a forward pass to retrieve the token embeddings associated with the input IDs.

Parameters:
input_ids : torch.Tensor

A batched tensor of input IDs, of shape (batch_size, max_length).

Returns:
torch.Tensor

A tensor of associated embeddings of shape (batch_size, max_length, embedding_dim).
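For illustration, a usage example based on the documented interface follows; the vocabulary size, embedding dimension, and batch shape are arbitrary choices, not values prescribed by the library.

>>> import torch
>>> from ner.nn.embeddings.embedding import TokenEmbedding
>>> embedder = TokenEmbedding(vocab_size=1000, embedding_dim=64, padding_idx=0)
>>> input_ids = torch.randint(0, 1000, (8, 32))  # batch_size=8, max_length=32
>>> embedder(input_ids).shape
torch.Size([8, 32, 64])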