Loss functions
- enum elsa::ml::LossReduction
  Reduction types for loss functions.

  Values:

  - enumerator Sum
    Reduce the loss by summing it up across the batch.
  - enumerator SumOverBatchSize
    Reduce the loss by summing it up across the batch and dividing by the batch size.
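The enumerator names suggest the following semantics: Sum adds the per-example losses, while SumOverBatchSize additionally divides that sum by the batch size. The stand-alone sketch below illustrates this interpretation on a toy batch without touching the elsa API; treat the exact reduction formulas as an assumption:

    #include <iostream>
    #include <numeric>
    #include <vector>

    int main()
    {
        // Toy per-example losses for a batch of four examples.
        std::vector<float> perExampleLoss = {0.2f, 0.5f, 0.1f, 0.4f};

        // LossReduction::Sum: add up the per-example losses.
        float sum = std::accumulate(perExampleLoss.begin(), perExampleLoss.end(), 0.0f);

        // LossReduction::SumOverBatchSize: the same sum divided by the batch size.
        float sumOverBatchSize = sum / static_cast<float>(perExampleLoss.size());

        std::cout << "Sum:              " << sum << "\n"               // 1.2
                  << "SumOverBatchSize: " << sumOverBatchSize << "\n"; // 0.3
    }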
- template<typename data_t = real_t>
  class elsa::ml::Loss
  Base class for all loss functions.

  Author: David Tellenbach

  Public Functions:

  - Loss() = default
    Default constructor.

  - ~Loss() = default
    Virtual destructor.

  - data_t getLoss(const DataContainer<data_t> &x, const DataContainer<data_t> &y) const
    Return the loss between predictions x and labels y.

  - data_t operator()(const DataContainer<data_t> &x, const DataContainer<data_t> &y)
    Return the loss between predictions x and labels y.

  - DataContainer<data_t> getLossGradient(const DataContainer<data_t> &x, const DataContainer<data_t> &y) const
    Return the loss gradient between predictions x and labels y.

  - std::string getName() const
    Return the name of this loss function.
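The sketch below shows how any of the concrete losses documented later in this section can be evaluated through this common base-class interface. It is a minimal illustration only; the include paths are assumptions and may differ in your elsa installation:

    #include <iostream>
    // Include paths are assumptions; adjust them to your elsa setup.
    #include "DataContainer.h"
    #include "ml/Loss.h"

    using namespace elsa;

    // Evaluate a concrete elsa::ml loss through the Loss<data_t> base-class interface.
    void report(ml::Loss<real_t>& loss, const DataContainer<real_t>& x,
                const DataContainer<real_t>& y)
    {
        // Scalar loss between predictions x and labels y.
        real_t value = loss.getLoss(x, y);

        // operator() likewise returns the loss between x and y.
        real_t sameValue = loss(x, y);

        // Loss gradient between predictions x and labels y.
        DataContainer<real_t> gradient = loss.getLossGradient(x, y);
        (void) gradient; // not printed in this sketch

        std::cout << loss.getName() << ": " << value << " (" << sameValue << ")\n";
    }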
- template<typename data_t = real_t>
  class elsa::ml::BinaryCrossentropy : public elsa::ml::Loss<real_t>
  Computes the cross-entropy loss between true labels and predicted labels.

  Use this cross-entropy loss when there are only two label classes (assumed to be 0 and 1). For each example, there should be a single floating-point value per prediction.
  Author: David Tellenbach

  Public Functions:

  - BinaryCrossentropy(LossReduction reduction = LossReduction::SumOverBatchSize)
    Construct a BinaryCrossentropy loss, optionally specifying the reduction type.
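A minimal usage sketch follows. It assumes that a DataContainer can be constructed from a VolumeDescriptor and an Eigen coefficient vector, that the batch is encoded as the last descriptor dimension, and that real_t defaults to float; the include paths are likewise assumptions:

    #include <Eigen/Dense>
    #include <iostream>
    // Include paths are assumptions; adjust them to your elsa setup.
    #include "DataContainer.h"
    #include "VolumeDescriptor.h"
    #include "ml/Loss.h"

    using namespace elsa;

    int main()
    {
        // Assumed layout: one prediction per example, batch of four examples.
        IndexVector_t size(2);
        size << 1, 4;
        VolumeDescriptor desc(size);

        Eigen::VectorXf predictions(4), labels(4);
        predictions << 0.9f, 0.2f, 0.8f, 0.1f; // predicted probabilities in [0, 1]
        labels      << 1.f,  0.f,  1.f,  0.f;  // binary labels, 0 or 1

        DataContainer<real_t> x(desc, predictions);
        DataContainer<real_t> y(desc, labels);

        // Default reduction is SumOverBatchSize; LossReduction::Sum is the alternative.
        ml::BinaryCrossentropy<real_t> bce;

        std::cout << bce.getName() << ": " << bce(x, y) << "\n";
    }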
- template<typename data_t = real_t>
  class elsa::ml::CategoricalCrossentropy : public elsa::ml::Loss<real_t>
  Computes the cross-entropy loss between the labels and predictions.

  Use this cross-entropy loss function when there are two or more label classes. Labels are expected in a one-hot representation. If you don't want to use one-hot encoded labels, use the SparseCategoricalCrossentropy loss instead. There should be # classes floating-point values per feature.

  Author: David Tellenbach

  Public Functions:

  - CategoricalCrossentropy(LossReduction reduction = LossReduction::SumOverBatchSize)
    Construct a CategoricalCrossentropy loss, optionally specifying the reduction type.
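As a sketch of the expected data layout, the snippet below encodes a batch of two examples with three classes, using one-hot labels. The (# classes, batch size) descriptor layout and the DataContainer construction from an Eigen vector are assumptions; the include paths may also differ:

    #include <Eigen/Dense>
    #include <iostream>
    // Include paths are assumptions; adjust them to your elsa setup.
    #include "DataContainer.h"
    #include "VolumeDescriptor.h"
    #include "ml/Loss.h"

    using namespace elsa;

    int main()
    {
        // Assumed layout: three classes per example, batch of two examples.
        IndexVector_t size(2);
        size << 3, 2;
        VolumeDescriptor desc(size);

        Eigen::VectorXf predictions(6), labels(6);
        predictions << 0.7f, 0.2f, 0.1f,  // example 0: distribution over 3 classes
                       0.1f, 0.1f, 0.8f;  // example 1
        labels      << 1.f, 0.f, 0.f,     // example 0: one-hot for class 0
                       0.f, 0.f, 1.f;     // example 1: one-hot for class 2

        DataContainer<real_t> x(desc, predictions);
        DataContainer<real_t> y(desc, labels);

        ml::CategoricalCrossentropy<real_t> cce; // default reduction: SumOverBatchSize
        std::cout << cce.getName() << ": " << cce.getLoss(x, y) << "\n";
    }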
- template<typename data_t = real_t>
  class elsa::ml::SparseCategoricalCrossentropy : public elsa::ml::Loss<real_t>
  Computes the cross-entropy loss between the labels and predictions.

  Use this cross-entropy loss function when there are two or more label classes. If you want to provide labels in a one-hot representation, use the CategoricalCrossentropy loss instead. There should be # classes floating-point values per feature for x and a single floating-point value per feature for y.

  Author: David Tellenbach

  Public Functions:

  - SparseCategoricalCrossentropy(LossReduction reduction = LossReduction::SumOverBatchSize)
    Construct a SparseCategoricalCrossentropy loss, optionally specifying the reduction type.
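The sketch below mirrors the CategoricalCrossentropy example, but the labels are class indices rather than one-hot vectors, so y holds a single value per example. The descriptor layout, the DataContainer construction, and the include paths are assumptions:

    #include <Eigen/Dense>
    #include <iostream>
    // Include paths are assumptions; adjust them to your elsa setup.
    #include "DataContainer.h"
    #include "VolumeDescriptor.h"
    #include "ml/Loss.h"

    using namespace elsa;

    int main()
    {
        // Predictions: three classes per example, batch of two examples (assumed layout).
        IndexVector_t xSize(2), ySize(2);
        xSize << 3, 2;
        ySize << 1, 2; // labels: one class index per example

        Eigen::VectorXf predictions(6), labels(2);
        predictions << 0.7f, 0.2f, 0.1f,
                       0.1f, 0.1f, 0.8f;
        labels << 0.f, 2.f; // class indices instead of one-hot vectors

        VolumeDescriptor xDesc(xSize), yDesc(ySize);
        DataContainer<real_t> x(xDesc, predictions);
        DataContainer<real_t> y(yDesc, labels);

        ml::SparseCategoricalCrossentropy<real_t> scce;
        std::cout << scce.getName() << ": " << scce(x, y) << "\n";
    }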
- template<typename data_t = real_t>
  class elsa::ml::MeanSquaredError : public elsa::ml::Loss<real_t>
  Computes the mean squared error between labels y and predictions x:

  \[ \text{loss} = (y - x)^2 \]

  Author: David Tellenbach

  Public Functions:

  - MeanSquaredError(LossReduction reduction = LossReduction::SumOverBatchSize)
    Construct a MeanSquaredError loss, optionally specifying the reduction type.
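To round the section off, here is a sketch that evaluates the mean squared error and its gradient. As above, the DataContainer construction, the batch layout, and the include paths are assumptions:

    #include <Eigen/Dense>
    #include <iostream>
    // Include paths are assumptions; adjust them to your elsa setup.
    #include "DataContainer.h"
    #include "VolumeDescriptor.h"
    #include "ml/Loss.h"

    using namespace elsa;

    int main()
    {
        // Assumed layout: one value per example, batch of four examples.
        IndexVector_t size(2);
        size << 1, 4;
        VolumeDescriptor desc(size);

        Eigen::VectorXf predictions(4), labels(4);
        predictions << 1.0f, 2.0f, 3.0f, 4.0f;
        labels      << 1.5f, 2.0f, 2.5f, 4.0f;

        DataContainer<real_t> x(desc, predictions);
        DataContainer<real_t> y(desc, labels);

        ml::MeanSquaredError<real_t> mse(ml::LossReduction::SumOverBatchSize);

        // Scalar loss value and the loss gradient between x and y.
        real_t loss = mse(x, y);
        DataContainer<real_t> gradient = mse.getLossGradient(x, y);
        (void) gradient; // not printed in this sketch

        std::cout << mse.getName() << ": " << loss << "\n";
    }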