Function bce_with_logits_loss
pub fn bce_with_logits_loss<B>(
    logits: &Tensor<B>,
    target: &Tensor<B>,
) -> Result<Tensor<B>, Error>
where B: Backend,

Binary Cross-Entropy with Logits (numerically stable).

Combines the sigmoid activation and binary cross-entropy in a single expression: loss = mean(max(x, 0) - x*t + log(1 + exp(-|x|))), where x is the logit and t is the target.

This is numerically stable for any logit value: exp(-|x|) never exceeds 1, so the log term cannot overflow, unlike the naive log(sigmoid(x)) formulation, which underflows for large negative logits.

§Arguments

  • logits: raw scores (before sigmoid), any shape
  • target: binary targets in {0, 1}, same shape
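As a rough illustration, the stable formula can be evaluated on scalars (a sketch only, not this crate's API; the actual function operates element-wise on tensors before taking the mean):

```rust
/// Scalar sketch of the numerically stable BCE-with-logits term:
/// max(x, 0) - x*t + ln(1 + exp(-|x|))
fn bce_with_logits_scalar(x: f64, t: f64) -> f64 {
    // ln_1p(y) computes ln(1 + y) accurately for small y, and
    // exp(-|x|) is always in (0, 1], so nothing here can overflow.
    x.max(0.0) - x * t + (-x.abs()).exp().ln_1p()
}

fn main() {
    // Large positive logit, target 1: nearly correct, loss near 0.
    println!("{}", bce_with_logits_scalar(100.0, 1.0));
    // At logit 0 (probability 0.5), the loss is ln 2 for either target.
    println!("{}", bce_with_logits_scalar(0.0, 1.0));
    // Extreme negative logit stays finite, where the naive
    // ln(sigmoid(x)) form would produce -inf.
    println!("{}", bce_with_logits_scalar(-1000.0, 0.0));
}
```

The per-element losses are then averaged to produce the scalar loss, matching the `mean(...)` in the formula above.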