Classification#

Classification loss functions.

nabla.nn.losses.classification.cross_entropy_loss(logits, targets, axis=-1)[source]#

Compute cross-entropy loss between logits and targets.

Parameters:
  • logits (Array) – Raw model outputs (pre-softmax), of shape [batch_size, num_classes]

  • targets (Array) – One-hot encoded targets, of shape [batch_size, num_classes]

  • axis (int) – Axis along which to compute softmax

Returns:

Scalar loss value

Return type:

Array
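
For orientation, here is a minimal NumPy sketch of the quantity this loss computes: softmax over the class axis followed by the negative log-likelihood of the one-hot targets. It is an illustrative reference only, not nabla's implementation; the input values are invented and a mean reduction over the batch is assumed.

```python
import numpy as np

# Toy batch: 2 samples, 3 classes (values invented for illustration).
logits = np.array([[2.0, 0.5, -1.0],
                   [0.1, 1.5, 0.3]])
targets = np.array([[1.0, 0.0, 0.0],   # one-hot labels
                    [0.0, 1.0, 0.0]])

# Softmax over the class axis, followed by the mean negative log-likelihood.
probs = np.exp(logits - logits.max(axis=-1, keepdims=True))
probs /= probs.sum(axis=-1, keepdims=True)
loss = -np.mean(np.sum(targets * np.log(probs), axis=-1))
print(loss)  # scalar loss value
```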

nabla.nn.losses.classification.sparse_cross_entropy_loss(logits, targets, axis=-1)[source]#

Compute cross-entropy loss with integer targets.

Parameters:
  • logits (Array) – Raw model outputs, of shape [batch_size, num_classes]

  • targets (Array) – Integer class indices, of shape [batch_size]

  • axis (int) – Axis along which to compute softmax

Returns:

Scalar loss value

Return type:

Array
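
The sparse variant takes integer class indices instead of one-hot vectors. The NumPy sketch below shows the same computation for that case; it is illustrative only, with made-up inputs and an assumed mean reduction over the batch.

```python
import numpy as np

logits = np.array([[2.0, 0.5, -1.0],
                   [0.1, 1.5, 0.3]])
targets = np.array([0, 1])  # integer class index per sample

# Log-softmax over the class axis, then pick the log-probability of the true class.
shifted = logits - logits.max(axis=-1, keepdims=True)
log_probs = shifted - np.log(np.exp(shifted).sum(axis=-1, keepdims=True))
loss = -np.mean(log_probs[np.arange(len(targets)), targets])
print(loss)  # scalar loss value
```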

nabla.nn.losses.classification.binary_cross_entropy_loss(predictions, targets, eps=1e-07)[source]#

Compute binary cross-entropy loss.

Parameters:
  • predictions (Array) – Model predictions (sigmoid probabilities), of shape [batch_size]

  • targets (Array) – Binary targets (0 or 1), of shape [batch_size]

  • eps (float) – Small constant for numerical stability

Returns:

Scalar loss value

Return type:

Array
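
A minimal NumPy sketch of the binary case, showing how eps keeps the predictions away from exactly 0 and 1 before taking logs. Again this is a reference calculation with invented inputs and an assumed mean reduction over the batch, not the library's actual code.

```python
import numpy as np

predictions = np.array([0.9, 0.2, 0.7])  # sigmoid outputs (invented values)
targets = np.array([1.0, 0.0, 1.0])      # binary labels
eps = 1e-7

# Clip so log() never sees exactly 0 or 1, then average the per-sample BCE.
p = np.clip(predictions, eps, 1.0 - eps)
loss = -np.mean(targets * np.log(p) + (1.0 - targets) * np.log(1.0 - p))
print(loss)  # scalar loss value
```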

nabla.nn.losses.classification.softmax_cross_entropy_loss(logits, targets, axis=-1)[source]#

Compute softmax cross-entropy loss (numerically stable).

This is equivalent to cross_entropy_loss but more numerically stable, since the softmax and cross-entropy computations are combined rather than performed as two separate steps.

Parameters:
  • logits (Array) – Raw model outputs, of shape [batch_size, num_classes]

  • targets (Array) – One-hot encoded targets, of shape [batch_size, num_classes]

  • axis (int) – Axis along which to compute softmax

Returns:

Scalar loss value

Return type:

Array
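
The stability gain typically comes from working in log space with the log-sum-exp trick, instead of computing log(softmax(logits)) in two steps, where the softmax can underflow to 0 and the log then produces -inf. The sketch below illustrates that idea; it reflects an assumption about the kind of fusion meant here and is not nabla's actual implementation.

```python
import numpy as np

logits = np.array([[1000.0, 0.0, -1000.0],   # extreme values to show stability
                   [0.1, 1.5, 0.3]])
targets = np.array([[1.0, 0.0, 0.0],
                    [0.0, 1.0, 0.0]])

# Log-softmax via the log-sum-exp trick: subtract the row max before exponentiating,
# so exp() never overflows; log(softmax) computed separately could hit log(0) = -inf.
m = logits.max(axis=-1, keepdims=True)
log_probs = (logits - m) - np.log(np.exp(logits - m).sum(axis=-1, keepdims=True))
loss = -np.mean(np.sum(targets * log_probs, axis=-1))
print(loss)  # finite even for very large-magnitude logits
```

In actual use, the arguments would of course be nabla Array values as documented above rather than NumPy arrays.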