Package de.jungblut.math.activation

Class ReluActivationFunction

java.lang.Object
    de.jungblut.math.activation.AbstractActivationFunction
        de.jungblut.math.activation.ReluActivationFunction

All Implemented Interfaces:
    ActivationFunction

public final class ReluActivationFunction extends AbstractActivationFunction
Rectified linear units implementation.

Author:
    thomas.jungblut
Constructor Summary

Constructor                   Description
ReluActivationFunction()
Method Summary

Modifier and Type    Method                    Description
double               apply(double input)       Applies the activation function on the given element.
double               gradient(double input)    Applies the gradient of the activation function on the given element.
Methods inherited from class de.jungblut.math.activation.AbstractActivationFunction
    apply, apply, gradient, gradient, newInstance, newInstance, toString
Method Detail

apply

public double apply(double input)

Description copied from interface: ActivationFunction
Applies the activation function on the given element.
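The Javadoc does not show the method body. As a rough illustration, the standard rectified linear unit is defined as max(0, x), which a sketch of this method would presumably compute; the class name ReluSketch below is hypothetical and not part of de.jungblut.math:

```java
// Hypothetical standalone sketch, not the library's actual source.
public class ReluSketch {

    // Standard ReLU: pass positive inputs through unchanged,
    // clamp negative inputs to zero.
    public static double apply(double input) {
        return Math.max(0.0, input);
    }

    public static void main(String[] args) {
        System.out.println(apply(2.5));  // prints 2.5
        System.out.println(apply(-1.0)); // prints 0.0
    }
}
```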
-
gradient

public double gradient(double input)

Description copied from interface: ActivationFunction
Applies the gradient of the activation function on the given element.
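Again the body is not shown. The standard ReLU derivative is 1 for positive inputs and 0 otherwise (the value at exactly 0 is a convention), which a sketch of this method would presumably follow; ReluGradientSketch is a hypothetical name for illustration only:

```java
// Hypothetical standalone sketch, not the library's actual source.
public class ReluGradientSketch {

    // Standard ReLU derivative: 1 for positive inputs, 0 otherwise.
    // (The subgradient at exactly 0 is a convention; 0 is assumed here.)
    public static double gradient(double input) {
        return input > 0.0 ? 1.0 : 0.0;
    }

    public static void main(String[] args) {
        System.out.println(gradient(2.5));  // prints 1.0
        System.out.println(gradient(-1.0)); // prints 0.0
    }
}
```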