hinge_loss: SVM hinge loss

hinge_loss

hypercoil.loss.hinge_loss(Y_hat: Tensor, Y: Tensor, *, key: PRNGKey | None = None) -> Tensor

Hinge loss function.

This is the loss function used in support vector machines. It is a special case of constraint_violation() or unilateral_loss() where the inputs are transformed according to the following:

\[1 - Y \hat{Y}\]
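The formula above, followed by a unilateral rectification, can be sketched in plain Python (this is an illustrative analogue, not the library's implementation, which operates on JAX tensors):

```python
def hinge_loss_sketch(Y_hat, Y):
    """Elementwise hinge loss: max(0, 1 - Y * Y_hat).

    Illustrative sketch of the transformation above applied per element,
    with the unilateral rectification folded in as max(0, .).
    """
    return [max(0.0, 1.0 - y * yh) for yh, y in zip(Y_hat, Y)]

# A prediction on the correct side of the margin (Y * Y_hat >= 1) incurs
# zero loss; margin violations and misclassifications incur positive loss.
print(hinge_loss_sketch([2.0, 0.5, -0.3], [1, 1, 1]))  # [0.0, 0.5, 1.3]
```

Note that the loss is zero only when the prediction is both correct in sign and confident (margin of at least 1), which is what drives the max-margin behaviour of support vector machines.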

HingeLoss

class hypercoil.loss.HingeLoss(nu: float = 1.0, name: str | None = None, *, scalarisation: Callable | None = None, key: 'jax.random.PRNGKey' | None = None)

Hinge loss function.

This is the loss function used in support vector machines. It is a special case of constraint_violation() or unilateral_loss() where the inputs are transformed according to the following:

\[1 - Y \hat{Y}\]
Parameters:
name: str

Designated name of the loss function. It is not required that this be specified, but it is recommended to ensure that the loss function can be identified in the context of reporting utilities. If not explicitly specified, the name will be inferred from the class name and the name of the scoring function.

nu: float

Loss strength multiplier. This is a scalar multiplier that is applied to the loss value before it is returned. This can be used to modulate the relative contributions of different loss functions to the overall loss value. It can also be used to implement a schedule for the loss function, by dynamically adjusting the multiplier over the course of training.

scalarisation: Callable

The scalarisation function to be used to aggregate the values returned by the scoring function. This function should take a single argument, which is a tensor of arbitrary shape, and return a single scalar value. By default, the sum scalarisation is used.
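How `nu` and `scalarisation` combine can be sketched in plain Python (a hypothetical analogue of the class's behaviour, using lists rather than JAX tensors; the function and parameter names mirror the documentation above):

```python
def scaled_hinge_sketch(Y_hat, Y, nu=1.0, scalarisation=sum):
    """Hypothetical sketch: score elementwise, scalarise, then apply nu."""
    # Elementwise hinge scores, as in the formula above.
    scores = [max(0.0, 1.0 - y * yh) for yh, y in zip(Y_hat, Y)]
    # Aggregate the scores to a scalar, then scale by the loss-strength
    # multiplier nu.
    return nu * scalarisation(scores)

# Halving nu halves this loss's contribution to a composite objective,
# which is how a training schedule for the loss can be implemented.
print(scaled_hinge_sketch([2.0, 0.5], [1, 1], nu=0.5))  # 0.25
```

Passing a different aggregator, e.g. `scalarisation=max`, changes only the reduction step while the elementwise scoring stays the same.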

Methods

__call__(Y_hat, Y, *[, key])

Call self as a function.