| AbstractActivationFunction |
Implements the boilerplate for applying an activation function to container
classes such as vectors and matrices by applying it to every element.
|
| ElliotActivationFunction |
Implementation of the Elliot activation function.
|
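The Elliot function is commonly defined as f(x) = x / (1 + |x|), a fast, exponential-free approximation of a sigmoid-shaped curve. The class name and comment above don't spell out the formula, so the following is a minimal sketch under that common definition, not necessarily this library's exact implementation (`ElliotSketch` and its method are hypothetical names):

```java
// Sketch of the Elliot ("fast sigmoid") activation, assuming the
// common definition f(x) = x / (1 + |x|). Maps any real input
// smoothly into (-1, 1) without calling exp(), which is why it is
// often used as a cheap sigmoid/tanh substitute.
public class ElliotSketch {
    public static double elliot(double x) {
        return x / (1.0 + Math.abs(x));
    }
}
```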
| LinearActivationFunction |
Linear activation function.
|
| LogActivationFunction |
Log activation function, guarded against NaN and infinity edge cases.
|
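One common way to guard a logarithmic activation against NaN and infinity is the signed, shifted form f(x) = sign(x) * ln(1 + |x|), which is finite and well-defined for every finite input. The actual guard used by the class is not shown above, so this is only an illustrative sketch of that technique (`GuardedLogSketch` is a hypothetical name):

```java
// Sketch of a NaN/infinity-safe log activation, assuming the signed
// form f(x) = sign(x) * ln(1 + |x|). Unlike a bare Math.log(x), this
// never sees a non-positive argument, so it cannot produce NaN or
// -Infinity for finite inputs.
public class GuardedLogSketch {
    public static double guardedLog(double x) {
        if (x >= 0.0) {
            return Math.log1p(x);   // ln(1 + x), finite for all x >= 0
        }
        return -Math.log1p(-x);     // odd extension for negative inputs
    }
}
```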
| ReluActivationFunction |
Implementation of the rectified linear unit (ReLU).
|
| SigmoidActivationFunction |
Implementation of the sigmoid function.
|
| SoftMaxActivationFunction |
Softmax activation that only works on vectors, because each output must be
normalized by the sum of the exponentials over all elements.
|
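The reason softmax cannot be applied element-wise like the other functions is that every output depends on the whole vector: each entry is an exponential divided by the sum of all exponentials. A minimal sketch over a plain `double[]` (with the standard max-shift for numerical stability; the class and method names here are illustrative, not the library's API):

```java
import java.util.Arrays;

// Sketch of a numerically stable softmax. Subtracting the maximum
// before exponentiating does not change the result (the shift cancels
// in the ratio) but prevents overflow for large inputs.
public class SoftmaxSketch {
    public static double[] softmax(double[] v) {
        double max = Arrays.stream(v).max().orElse(0.0);
        double[] out = new double[v.length];
        double sum = 0.0;
        for (int i = 0; i < v.length; i++) {
            out[i] = Math.exp(v[i] - max);
            sum += out[i];
        }
        for (int i = 0; i < out.length; i++) {
            out[i] /= sum; // normalize so the outputs sum to 1
        }
        return out;
    }
}
```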
| SoftplusReluActivationFunction |
Softplus activation, a smooth approximation of the rectified linear unit.
|
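The class name suggests the softplus function, f(x) = ln(1 + e^x), whose graph is a smoothed ReLU: near zero for large negative x and approximately x for large positive x. Since the entry carries no further detail, this is only a sketch under that assumption (`SoftplusSketch` is a hypothetical name):

```java
// Sketch of softplus, f(x) = ln(1 + e^x). For large x, exp(x)
// overflows a double, but there f(x) ≈ x, so we branch to the
// asymptote instead of computing the exponential.
public class SoftplusSketch {
    public static double softplus(double x) {
        if (x > 30.0) {
            return x; // ln(1 + e^x) ≈ x beyond double precision here
        }
        return Math.log1p(Math.exp(x));
    }
}
```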
| StepActivationFunction |
Classic perceptron-like step function.
|
| TanhActivationFunction |
Implementation of the Tanh activation based on FastMath.
|