Package de.jungblut.math.minimize
Class GradientDescent.GradientDescentBuilder

java.lang.Object
  de.jungblut.math.minimize.GradientDescent.GradientDescentBuilder

Enclosing class:
  GradientDescent

public static class GradientDescent.GradientDescentBuilder extends java.lang.Object
Method Summary

Modifier and Type	Method and Description
GradientDescent.GradientDescentBuilder	annealingAfter(int iteration)
	Sets a simple annealing (alpha / (1 + current_iteration / phi)), where phi is the given parameter.
GradientDescent.GradientDescentBuilder	boldDriver()
	Bold driver will change the learning rate over time by observing the cost of the cost function.
GradientDescent.GradientDescentBuilder	boldDriver(double increasedCostPercentage, double decreasedCostPercentage)
	Bold driver will change the learning rate over time by observing the cost of the cost function.
GradientDescent.GradientDescentBuilder	breakOnDifference(double delta)
	Breaks the minimization process when the given delta in cost has been achieved.
GradientDescent.GradientDescentBuilder	breakOnDivergence()
	If called, this breaks when the gradient descent minimizer starts to diverge (costs are growing).
GradientDescent	build()
static GradientDescent.GradientDescentBuilder	create(double alpha)
	Creates a new builder.
GradientDescent.GradientDescentBuilder	momentum(double momentum)
	Add momentum to this gradient descent minimizer.
Method Detail
-
build
public GradientDescent build()
-
momentum

public GradientDescent.GradientDescentBuilder momentum(double momentum)

Add momentum to this gradient descent minimizer.

Parameters:
    momentum - the momentum to use, between 0 and 1.
Returns:
    the builder again.
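The Javadoc does not spell out the exact update rule, so as an illustration only, here is the classical momentum formulation that a setting like momentum(0.9) typically controls: a velocity term accumulates a decaying sum of past gradients and is applied to the parameters. Treat the formula as an assumption about the library, not its actual internals.

```java
// Sketch of a classical momentum update (an assumption; not the
// library's verified implementation).
public class MomentumSketch {

    // One parameter update with momentum mu and learning rate alpha.
    static double[] step(double[] x, double[] velocity, double[] gradient,
                         double alpha, double mu) {
        for (int i = 0; i < x.length; i++) {
            // accumulate a decaying average of past gradients
            velocity[i] = mu * velocity[i] - alpha * gradient[i];
            x[i] += velocity[i];
        }
        return x;
    }

    public static void main(String[] args) {
        double[] x = {1.0};
        double[] v = {0.0};
        // gradient of f(x) = x^2 at x = 1 is 2; first step has no history,
        // so it is a plain gradient step: x moves to roughly 0.8
        step(x, v, new double[]{2.0}, 0.1, 0.9);
        System.out.println(x[0]);
    }
}
```

On the first step the velocity is zero, so momentum only shows its effect from the second iteration onward, when past gradients start to contribute.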
-
boldDriver

public GradientDescent.GradientDescentBuilder boldDriver()

Bold driver will change the learning rate over time by observing the cost of the cost function. If the cost decreases, it will increase the learning rate by 5%. If the cost increases, it will cut the learning rate in half.

Returns:
    the builder again.
-
boldDriver

public GradientDescent.GradientDescentBuilder boldDriver(double increasedCostPercentage, double decreasedCostPercentage)

Bold driver will change the learning rate over time by observing the cost of the cost function. If the cost decreases, it will increase the learning rate (typically by 5%). If the cost increases, it will (typically) cut the learning rate in half.

Parameters:
    increasedCostPercentage - the percentage of the learning rate that will be used when the cost increases.
    decreasedCostPercentage - the percentage of the learning rate that will be used when the cost decreases.
Returns:
    the builder again.
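As an illustration of the heuristic described above (not the library's internals), the following sketch interprets both parameters as multiplicative factors applied to the current learning rate; that interpretation of "percentage" is an assumption.

```java
// Sketch of the bold driver learning-rate adaptation described in the
// Javadoc. Parameter interpretation (multiplicative factors) is assumed.
public class BoldDriverSketch {

    static double adapt(double alpha, double previousCost, double currentCost,
                        double increasedCostFactor, double decreasedCostFactor) {
        if (currentCost < previousCost) {
            // cost went down: grow the learning rate (default ~5%, i.e. factor 1.05)
            return alpha * decreasedCostFactor;
        }
        // cost went up: shrink the learning rate (default: halve it, i.e. factor 0.5)
        return alpha * increasedCostFactor;
    }

    public static void main(String[] args) {
        double alpha = 0.1;
        alpha = adapt(alpha, 5.0, 4.0, 0.5, 1.05); // cost decreased: rate grows 5%
        alpha = adapt(alpha, 4.0, 6.0, 0.5, 1.05); // cost increased: rate halved
        System.out.println(alpha);
    }
}
```

With the defaults mentioned above, growth is gentle and shrinkage is aggressive, so a single divergent step quickly undoes many small increases.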
-
breakOnDivergence

public GradientDescent.GradientDescentBuilder breakOnDivergence()

If called, this breaks when the gradient descent minimizer starts to diverge (costs are growing).

Returns:
    the builder again.
-
breakOnDifference

public GradientDescent.GradientDescentBuilder breakOnDifference(double delta)

Breaks the minimization process when the given delta in cost has been achieved. Usually a quite low value, between 1e-4 and 1e-8.

Parameters:
    delta - the difference between two successive costs at which to break.
Returns:
    the builder again.
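The stopping criterion this method describes can be sketched as comparing the absolute difference between two successive costs against delta; this is an illustration of the described behavior, not the library's code.

```java
// Sketch of the break-on-difference convergence check described above.
public class DeltaBreakSketch {

    // break when the improvement between two iterations drops below delta
    static boolean shouldBreak(double previousCost, double currentCost, double delta) {
        return Math.abs(previousCost - currentCost) < delta;
    }

    public static void main(String[] args) {
        System.out.println(shouldBreak(0.51230, 0.51229, 1e-4)); // tiny improvement: stop
        System.out.println(shouldBreak(0.60, 0.40, 1e-4));       // large improvement: continue
    }
}
```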
-
annealingAfter

public GradientDescent.GradientDescentBuilder annealingAfter(int iteration)

Sets a simple annealing (alpha / (1 + current_iteration / phi)), where phi is the given parameter. This will gradually lower the global learning rate after the given number of iterations.

Parameters:
    iteration - the iteration at which to start annealing.
Returns:
    the builder again.
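The annealing formula quoted above is simple enough to work through directly. As a sketch (again an illustration, not the library's code), with phi equal to the iteration parameter, the effective learning rate has halved by the time current_iteration reaches phi:

```java
// Sketch of the annealing schedule alpha / (1 + current_iteration / phi)
// quoted in the description above.
public class AnnealingSketch {

    static double annealedAlpha(double alpha, int currentIteration, int phi) {
        return alpha / (1.0 + (double) currentIteration / phi);
    }

    public static void main(String[] args) {
        // with phi = 100: full rate at iteration 0, half rate by iteration 100
        System.out.println(annealedAlpha(0.1, 0, 100));
        System.out.println(annealedAlpha(0.1, 100, 100));
    }
}
```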
-
create

public static GradientDescent.GradientDescentBuilder create(double alpha)

Creates a new builder.

Parameters:
    alpha - the learning rate to set.
Returns:
    a new builder.
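Taken together, the builder methods documented above are designed to be chained. A hypothetical configuration might look like the following sketch; the chosen values (alpha 0.01, momentum 0.9, delta 1e-6) are arbitrary examples, and the snippet assumes the de.jungblut.math library is on the classpath.

```java
import de.jungblut.math.minimize.GradientDescent;

// Sketch of chaining the builder methods from this page; values are
// illustrative, not recommended defaults.
GradientDescent minimizer = GradientDescent.GradientDescentBuilder
    .create(0.01)            // learning rate alpha
    .momentum(0.9)           // momentum between 0 and 1
    .boldDriver()            // adaptive learning rate (+5% / halve)
    .breakOnDifference(1e-6) // stop once cost improvement drops below 1e-6
    .breakOnDivergence()     // stop if costs start growing
    .build();
```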