Package de.jungblut.math.activation
Class SoftplusReluActivationFunction
java.lang.Object
  de.jungblut.math.activation.AbstractActivationFunction
    de.jungblut.math.activation.SoftplusReluActivationFunction

All Implemented Interfaces:
  ActivationFunction
public final class SoftplusReluActivationFunction extends AbstractActivationFunction
Smoothed approximation to a ReluActivationFunction.

Author:
  thomas.jungblut
Constructor Summary
Constructor                          Description
SoftplusReluActivationFunction()
-
Method Summary
Modifier and Type  Method                  Description
double             apply(double input)     Applies the activation function on the given element.
double             gradient(double input)  Applies the gradient of the activation function on the given element.
Methods inherited from class de.jungblut.math.activation.AbstractActivationFunction
apply, apply, gradient, gradient, newInstance, newInstance, toString
Method Detail
-
apply
public double apply(double input)
Description copied from interface: ActivationFunction
Applies the activation function on the given element.
-
gradient
public double gradient(double input)
Description copied from interface: ActivationFunction
Applies the gradient of the activation function on the given element.
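For reference, the softplus function is softplus(x) = ln(1 + e^x), whose derivative is the logistic sigmoid 1 / (1 + e^-x). The sketch below is a hypothetical, self-contained illustration of these two formulas; it is not the library's actual implementation, which may include additional overflow safeguards or the class hierarchy's vectorized overloads.

```java
// Hypothetical sketch of softplus and its gradient, matching the
// apply/gradient contract described above. Not the library's source.
public class SoftplusSketch {

    // softplus(x) = ln(1 + e^x), a smooth approximation of ReLU max(0, x).
    // log1p keeps precision for small exp(input).
    static double apply(double input) {
        return Math.log1p(Math.exp(input));
    }

    // d/dx softplus(x) = 1 / (1 + e^-x), the logistic sigmoid.
    static double gradient(double input) {
        return 1.0 / (1.0 + Math.exp(-input));
    }

    public static void main(String[] args) {
        System.out.println(apply(0.0));    // ln(2) ≈ 0.6931
        System.out.println(gradient(0.0)); // 0.5
        System.out.println(apply(10.0));   // ≈ 10.0000454, close to ReLU's 10
    }
}
```

For large positive inputs softplus approaches the identity and for large negative inputs it approaches zero, which is why it serves as a smooth stand-in for ReLU.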