besskge.loss.SampledSoftmaxCrossEntropyLoss

class besskge.loss.SampledSoftmaxCrossEntropyLoss(n_entity, loss_scale=1.0)[source]

The sampled softmax cross-entropy loss (see [JCMB15] and [CJM+22]).
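
As a sketch of the idea (the notation below is illustrative, not taken from the BESS-KGE source), the positive triple is treated as the true class among itself and the sampled negatives, giving a per-triple cross-entropy

    \mathcal{L} = -\, w \, \log \frac{\exp(s^{+})}{\exp(s^{+}) + \sum_{j=1}^{n} \exp(s^{-}_{j})}

where s^+ is the positive score, s^-_j are the n negative scores and w is the triple weight. In sampled softmax [JCMB15], candidate logits are additionally corrected by the log-probability of drawing each candidate from the proposal distribution; n_entity presumably enters through this correction (e.g. log(1/n_entity) under uniform sampling).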

Initialize the sampled softmax cross-entropy loss.

Parameters:
  • n_entity (int) – Total number of entities in the knowledge graph.

  • loss_scale (float) – default: 1.0. Loss scaling factor; might be needed when using FP16 weights.

forward(positive_score, negative_score, triple_weight)[source]

Compute batch loss.

Parameters:
  • positive_score (Tensor) – shape: (batch_size,). Scores of positive triples.

  • negative_score (Tensor) – shape: (batch_size, n_negative). Scores of negative triples.

  • triple_weight (Tensor) – shape: (batch_size,) or (). Weights of positive triples.

Return type:

Tensor

Returns:

The batch loss.
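
A minimal usage sketch, assuming only the documented forward signature; the sizes and random scores are illustrative placeholders, not taken from the BESS-KGE documentation:

    import torch

    from besskge.loss import SampledSoftmaxCrossEntropyLoss

    batch_size, n_negative = 4, 8
    loss_fn = SampledSoftmaxCrossEntropyLoss(n_entity=1000)

    # Stand-ins for the output of a KGE scoring function
    positive_score = torch.randn(batch_size)              # shape: (batch_size,)
    negative_score = torch.randn(batch_size, n_negative)  # shape: (batch_size, n_negative)
    triple_weight = torch.ones(batch_size)                # uniform per-triple weights

    loss = loss_fn.forward(positive_score, negative_score, triple_weight)  # scalar Tensor

In a real training step, the scores would come from scoring the positive triples and their corruptions with the embedding model.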

get_negative_weights(negative_score)

Construct weights of negative samples based on their scores.

Parameters:

negative_score (Tensor) – shape: (batch_size, n_negative). Scores of negative samples.

Return type:

Tensor

Returns:

shape: (batch_size, n_negative) if BaseLossFunction.negative_adversarial_sampling, else (). Weights of negative samples.
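
For intuition: self-adversarial weighting, as introduced for knowledge graph embedding in Sun et al.'s RotatE paper, is a detached softmax over each triple's negative scores, sharpened by the reciprocal temperature negative_adversarial_scale. A minimal sketch of that computation, assuming this standard form rather than quoting the BESS-KGE source:

    import torch

    def self_adversarial_weights(negative_score: torch.Tensor, alpha: float) -> torch.Tensor:
        # Softmax across the negatives of each triple, sharpened by the
        # reciprocal temperature alpha; detach() stops gradients from
        # flowing through the weights themselves.
        return torch.softmax(alpha * negative_score, dim=-1).detach()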

loss_scale: Tensor

Loss scaling factor; might be needed when using FP16 weights.

negative_adversarial_sampling: bool

Use self-adversarial weighting of negative samples.

negative_adversarial_scale: Tensor

Reciprocal temperature of the self-adversarial weighting.