loss: Loss and regularisation
Loss functions and regularisations.
The loss submodule is a collection of differentiable functions that map
arbitrary inputs to scalar-valued outputs. These scalar outputs provide a
starting point for a backward pass through a differentiable program model.
Functionality is provided for measures of interest in functional brain
mapping as well as in other contexts.
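To make the role of the scalar output concrete, here is a minimal, library-agnostic sketch (not the submodule's own API) in which a mean-squared-error loss returns a single scalar and that scalar drives the backward pass via `jax.grad`. The toy linear model and parameter names are assumptions for illustration only.

```python
import jax
import jax.numpy as jnp

def mse(params, inputs, targets):
    preds = inputs @ params                   # toy linear "model" (assumed)
    return jnp.mean((preds - targets) ** 2)   # scalar-valued output

params = jnp.zeros((3,))
inputs = jnp.ones((5, 3))
targets = jnp.ones((5,))

# The scalar loss value is the starting point for the backward pass.
grads = jax.grad(mse)(params, inputs, targets)
```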
Helper wrappers allow multiple loss objectives to be packaged into a single call. Using the LossApply, LossArgument, and LossScheme functionality, each wrapped objective can be applied selectively to a subset of the input tensors.
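The sketch below illustrates the general pattern only; the concrete signatures of LossApply, LossArgument, and LossScheme are documented on their respective pages and may differ. Here each objective is paired with a selector that picks its inputs out of a shared argument dictionary, and the weighted results are summed into one scalar.

```python
import jax.numpy as jnp

def apply_losses(objectives, args):
    """objectives: iterable of (name, loss_fn, selector, weight) tuples (assumed layout)."""
    total = 0.0
    for name, loss_fn, selector, weight in objectives:
        # Each objective sees only the subset of inputs its selector picks out.
        total = total + weight * loss_fn(selector(args))
    return total

objectives = (
    ('weight_norm', lambda w: jnp.linalg.norm(w),              lambda a: a['weights'], 0.01),
    ('entropy',     lambda p: -jnp.sum(p * jnp.log(p + 1e-8)), lambda a: a['probs'],   0.1),
)
args = {'weights': jnp.ones((4, 4)), 'probs': jnp.array([0.25, 0.25, 0.5])}
total_loss = apply_losses(objectives, args)
```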
Loss
: Base class for scalar-valued losses

ParameterisedLoss
: Extensible class for custom parameterised losses

MSELoss
: Mean squared error

NormedLoss
: Normed parameter regularisation

identity
: Identity function

zero
: Zero function

difference
: Elementwise difference

constraint_violation
: Soft constraints

unilateral_loss
: Unilateral penalties

hinge_loss
: SVM hinge loss

smoothness
: Backwards differences

bimodal_symmetric
: Minimal distance from 2 modes

det_gram
: Gramian determinant

log_det_gram
: Gram log-determinant loss

entropy
: Categorical entropy

kl_divergence
: Kullback-Leibler divergence

js_divergence
: Jensen-Shannon divergence

bregman_divergence
: Bregman divergences

equilibrium
: Equilibrium loss

second_moment
: Second moments

auto_tol
: Significance tolerance

batch_corr
: Batch-axis correlation

qcfc
: QC-FC measures and loss

reference_tether
: Spatial tether to reference points

interhemispheric_tether
: Inter-hemispheric tethering loss

compactness
: Compactness

dispersion
: Vector dispersion

multivariate_kurtosis
: Time series stationarity

connectopy
: Generalised connectopy

modularity
: Relaxed modularity

sum_scalarise
: Sum scalarisation

mean_scalarise
: Mean scalarisation

meansq_scalarise
: Squared mean scalarisation

max_scalarise
: Maximum-value scalarisation

norm_scalarise
: Norm scalarisation

vnorm_scalarise
: Vector norm scalarisation

wmean_scalarise
: Weighted mean scalarisation

selfwmean_scalarise
: Self-weighted mean scalarisation

LossApply
: Selectively apply loss to parameters

LossScheme
: Scheme for multiple losses
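The scalarisation entries above (sum_scalarise, mean_scalarise, and related functions) suggest a common higher-order pattern: a scalarisation wraps a tensor-valued score into a scalar-valued loss suitable for the backward pass. The sketch below shows the idea only; the actual signatures and composition rules are given on the individual documentation pages.

```python
import jax.numpy as jnp

def mean_scalarise_sketch(score_fn):
    """Wrap a tensor-valued score function into a scalar-valued loss (illustrative)."""
    def loss(*args, **kwargs):
        return jnp.mean(score_fn(*args, **kwargs))
    return loss

# e.g. turn an elementwise difference score into a scalar loss
difference_score = lambda x, y: jnp.abs(x - y)
difference_loss = mean_scalarise_sketch(difference_score)
value = difference_loss(jnp.ones(3), jnp.zeros(3))   # -> 1.0
```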