class ElliotActivationFunction
    Implementation of the Elliott activation function.
class LinearActivationFunction
    Linear activation function.
class LogActivationFunction
    Log activation function, guarded against NaN and infinity edge cases.
class ReluActivationFunction
    Rectified linear units (ReLU) implementation.
class SigmoidActivationFunction
    Implementation of the sigmoid function.
class SoftMaxActivationFunction
    Softmax activation that only works on vectors, because it needs to sum and divide the probabilities.
class SoftplusReluActivationFunction
    Softplus ("smooth ReLU") activation function.
class StepActivationFunction
    Classic perceptron-like step function.
class TanhActivationFunction
    Implementation of the tanh activation based on FastMath.
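The formulas behind these classes can be sketched in plain Java. This is an illustrative sketch, not the library's actual API: the class name `ActivationSketch`, the method names, and the particular NaN/infinity guard used for the log activation are all assumptions made here for demonstration.

```java
// Illustrative sketch of common activation-function formulas.
// Names are hypothetical; they do not mirror the library's real classes.
public class ActivationSketch {

    // Elliott: x / (1 + |x|), a cheap sigmoid-shaped squashing function
    static double elliott(double x) { return x / (1.0 + Math.abs(x)); }

    // Sigmoid: 1 / (1 + e^(-x))
    static double sigmoid(double x) { return 1.0 / (1.0 + Math.exp(-x)); }

    // ReLU: max(0, x)
    static double relu(double x) { return Math.max(0.0, x); }

    // Softplus: ln(1 + e^x), a smooth approximation of ReLU
    static double softplus(double x) { return Math.log1p(Math.exp(x)); }

    // Step: classic perceptron threshold at zero
    static double step(double x) { return x >= 0.0 ? 1.0 : 0.0; }

    // One common way to guard a log activation against NaN/-infinity:
    // mirror ln(1 + |x|) around the origin so negative inputs stay finite.
    // (The library's actual guard may differ.)
    static double guardedLog(double x) {
        return x >= 0.0 ? Math.log1p(x) : -Math.log1p(-x);
    }

    // Softmax needs the whole vector: exponentiate, then divide by the sum,
    // which is why a scalar-only activation interface cannot express it.
    static double[] softmax(double[] v) {
        double max = Double.NEGATIVE_INFINITY;
        for (double x : v) max = Math.max(max, x); // shift by max for stability
        double sum = 0.0;
        double[] out = new double[v.length];
        for (int i = 0; i < v.length; i++) {
            out[i] = Math.exp(v[i] - max);
            sum += out[i];
        }
        for (int i = 0; i < out.length; i++) out[i] /= sum;
        return out;
    }

    public static void main(String[] args) {
        double[] p = softmax(new double[]{1.0, 2.0, 3.0});
        double total = 0.0;
        for (double x : p) total += x;
        // softmax output is a probability distribution: entries sum to ~1
        System.out.println("softmax sums to 1: " + (Math.abs(total - 1.0) < 1e-9));
    }
}
```

Note the max-shift in `softmax`: subtracting the largest entry before exponentiating avoids overflow for large inputs without changing the result, the same kind of edge-case guarding the log activation's documentation alludes to.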