| Package | Description |
|---|---|
| de.jungblut.math.activation | |
| Modifier and Type | Class and Description |
|---|---|
| class | `ElliotActivationFunction`: Implementation of the Elliot activation function. |
| class | `LinearActivationFunction`: Linear activation function. |
| class | `LogActivationFunction`: Log activation function, guarded against NaN and infinity edge cases. |
| class | `ReluActivationFunction`: Implementation of rectified linear units (ReLU). |
| class | `SigmoidActivationFunction`: Implementation of the sigmoid function. |
| class | `SoftMaxActivationFunction`: Softmax activation that only works on vectors, because it needs to sum over and normalize the probabilities. |
| class | `SoftplusReluActivationFunction`: Smoothed approximation to a `ReluActivationFunction`. |
| class | `StepActivationFunction`: Classic perceptron-like step function. |
| class | `TanhActivationFunction`: Implementation of the tanh activation based on `FastMath`. |
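The functions listed above have simple closed forms. The sketch below illustrates them with plain `java.lang.Math`; the method names and formulas (in particular the Elliot approximation `x / (1 + |x|)`) are illustrative assumptions, not the library's actual `de.jungblut.math.activation` API. Note how softmax, unlike the element-wise functions, needs the whole vector so it can normalize.

```java
// Illustrative sketch of common activation functions; this is NOT the
// de.jungblut.math.activation API, just the underlying math.
public class ActivationSketch {

  // Sigmoid: 1 / (1 + e^-x), squashes into (0, 1).
  static double sigmoid(double x) {
    return 1.0 / (1.0 + Math.exp(-x));
  }

  // ReLU: max(0, x).
  static double relu(double x) {
    return Math.max(0.0, x);
  }

  // Softplus: ln(1 + e^x), a smooth approximation to ReLU
  // (the idea behind SoftplusReluActivationFunction).
  static double softplus(double x) {
    return Math.log1p(Math.exp(x));
  }

  // Elliot (assumed form): x / (1 + |x|), a cheap sigmoid-like curve.
  static double elliot(double x) {
    return x / (1.0 + Math.abs(x));
  }

  // Softmax operates on the full vector: subtract the max for numeric
  // stability, exponentiate, then divide by the sum so outputs add to 1.
  static double[] softmax(double[] v) {
    double max = Double.NEGATIVE_INFINITY;
    for (double x : v) max = Math.max(max, x);
    double sum = 0.0;
    double[] out = new double[v.length];
    for (int i = 0; i < v.length; i++) {
      out[i] = Math.exp(v[i] - max);
      sum += out[i];
    }
    for (int i = 0; i < out.length; i++) out[i] /= sum;
    return out;
  }

  public static void main(String[] args) {
    System.out.println(sigmoid(0.0));               // 0.5
    System.out.println(relu(-3.0));                 // 0.0
    double[] p = softmax(new double[] {1.0, 2.0, 3.0});
    double s = 0.0;
    for (double x : p) s += x;
    System.out.println(s);                          // sums to ~1.0
  }
}
```

The max-subtraction in `softmax` does not change the result (it cancels in the ratio) but prevents `Math.exp` from overflowing to infinity for large inputs, the same kind of edge-case guarding the `LogActivationFunction` description mentions.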
Copyright © 2016. All rights reserved.