Package de.jungblut.math.loss

Interface LossFunction

All Known Implementing Classes:
CrossEntropyLoss, HingeLoss, LogLoss, MeanAbsoluteLoss, SquaredLoss, StepLoss

public interface LossFunction

Calculates the error, for example in the last layer of a neural net.

Author:
thomas.jungblut
Method Summary

de.jungblut.math.DoubleVector calculateGradient(de.jungblut.math.DoubleVector feature, de.jungblut.math.DoubleVector y, de.jungblut.math.DoubleVector hypothesis)
    Calculate the gradient with the given parameters.

double calculateLoss(de.jungblut.math.DoubleMatrix y, de.jungblut.math.DoubleMatrix hypothesis)
    Calculate the error with the given parameters.

double calculateLoss(de.jungblut.math.DoubleVector y, de.jungblut.math.DoubleVector hypothesis)
    Calculate the error with the given parameters.
Method Detail
calculateLoss
double calculateLoss(de.jungblut.math.DoubleMatrix y, de.jungblut.math.DoubleMatrix hypothesis)

Calculate the error with the given parameters.

Parameters:
y - the real outcome as a matrix; rows contain the examples, columns the examples' output.
hypothesis - the hypothesis as a matrix; rows contain the examples, columns the predicted output.
Returns:
a positive value that denotes the error between the two matrices.
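To make the batch semantics concrete, here is a minimal sketch of the matrix overload: the loss of a whole batch is the per-example (per-row) loss accumulated over all rows. Plain 2-D `double` arrays stand in for `de.jungblut.math.DoubleMatrix`, whose API is not shown on this page, and squared loss is used as an example concrete loss; both are assumptions for illustration, not this library's implementation.

```java
// Sketch: batch loss as the sum of per-row (per-example) losses.
// double[][] is an assumed stand-in for de.jungblut.math.DoubleMatrix.
public class MatrixLossSketch {

  // Example concrete loss for one row: squared loss 0.5 * sum((y - h)^2).
  static double rowLoss(double[] y, double[] hypothesis) {
    double sum = 0d;
    for (int i = 0; i < y.length; i++) {
      double diff = y[i] - hypothesis[i];
      sum += diff * diff;
    }
    return 0.5 * sum;
  }

  // Rows contain the examples, columns their (predicted) output,
  // matching the parameter description above.
  public static double calculateLoss(double[][] y, double[][] hypothesis) {
    double total = 0d;
    for (int row = 0; row < y.length; row++) {
      total += rowLoss(y[row], hypothesis[row]);
    }
    return total;
  }

  public static void main(String[] args) {
    double[][] y = { { 1d, 0d }, { 0d, 1d } };
    double[][] h = { { 0.9, 0.1 }, { 0.2, 0.8 } };
    System.out.println(calculateLoss(y, h)); // ≈ 0.05
  }
}
```

Because every per-row term is a square (hence non-negative), the accumulated batch loss is also non-negative, as the return contract requires.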
calculateLoss
double calculateLoss(de.jungblut.math.DoubleVector y, de.jungblut.math.DoubleVector hypothesis)

Calculate the error with the given parameters.

Parameters:
y - the real outcome of a single example as a vector.
hypothesis - the hypothesis for a single example as a vector.
Returns:
a positive value that denotes the error between the two vectors.
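The single-example overload can be sketched the same way. The sketch below uses plain `double[]` arrays in place of `de.jungblut.math.DoubleVector` (whose API is not documented here) and squared loss as an example concrete implementation; both choices are assumptions for illustration.

```java
// Sketch of the single-example overload: error between one real-outcome
// vector and one hypothesis vector, using squared loss as the example.
public class VectorLossSketch {

  // 0.5 * sum((y - h)^2): zero for a perfect prediction, positive otherwise.
  public static double calculateLoss(double[] y, double[] hypothesis) {
    double sum = 0d;
    for (int i = 0; i < y.length; i++) {
      double diff = y[i] - hypothesis[i];
      sum += diff * diff;
    }
    return 0.5 * sum;
  }

  public static void main(String[] args) {
    // One example with two outputs, e.g. a one-hot target.
    double loss = calculateLoss(new double[] { 1d, 0d },
                                new double[] { 0.8, 0.2 });
    System.out.println(loss); // ≈ 0.04
  }
}
```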
calculateGradient
de.jungblut.math.DoubleVector calculateGradient(de.jungblut.math.DoubleVector feature, de.jungblut.math.DoubleVector y, de.jungblut.math.DoubleVector hypothesis)

Calculate the gradient with the given parameters.

Parameters:
feature - the input feature vector of the example.
y - the real outcome of a single example as a vector.
hypothesis - the hypothesis for a single example as a vector.
Returns:
a vector that denotes the gradient given the hypothesis and real outcome.
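For the squared loss used in the sketches above, the gradient with respect to the hypothesis works out to `hypothesis - y`, since d/dh [0.5 * (h - y)^2] = h - y. The sketch below illustrates that case with plain `double[]` arrays in place of `de.jungblut.math.DoubleVector`; the `feature` parameter is accepted but unused here, on the assumption that only some implementations (not shown on this page) need it.

```java
// Sketch of calculateGradient for squared loss:
// grad_i = hypothesis_i - y_i (derivative of 0.5 * (h - y)^2 w.r.t. h).
// double[] is an assumed stand-in for de.jungblut.math.DoubleVector.
public class GradientSketch {

  public static double[] calculateGradient(double[] feature, double[] y,
                                           double[] hypothesis) {
    // feature is unused for squared loss; other losses may require it.
    double[] grad = new double[y.length];
    for (int i = 0; i < y.length; i++) {
      grad[i] = hypothesis[i] - y[i];
    }
    return grad;
  }

  public static void main(String[] args) {
    double[] grad = calculateGradient(new double[] { 1d, 2d },
                                      new double[] { 1d, 0d },
                                      new double[] { 0.8, 0.2 });
    System.out.println(java.util.Arrays.toString(grad)); // ≈ [-0.2, 0.2]
  }
}
```

A negative component means the hypothesis undershot the real outcome for that output, so a gradient-descent step (moving against the gradient) increases it.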