public class AdaptiveFTRLRegularizer extends GradientDescentUpdater
| Constructor and Description |
|---|
| `AdaptiveFTRLRegularizer(double beta, double l1, double l2)` Creates a new AdaptiveFTRLRegularizer. |
| Modifier and Type | Method and Description |
|---|---|
| `CostWeightTuple` | `computeNewWeights(de.jungblut.math.DoubleVector theta, de.jungblut.math.DoubleVector gradient, double learningRate, long iteration, double lambda, double cost)` Computes the new weights for the given gradient using the adaptive FTRL update. |
public AdaptiveFTRLRegularizer(double beta,
                               double l1,
                               double l2)

Creates a new AdaptiveFTRLRegularizer.

Parameters:
beta - the smoothing parameter for the learning rate.
l1 - the l1 regularization.
l2 - the l2 regularization.

public CostWeightTuple computeNewWeights(de.jungblut.math.DoubleVector theta,
                                         de.jungblut.math.DoubleVector gradient,
                                         double learningRate,
                                         long iteration,
                                         double lambda,
                                         double cost)

Specified by:
computeNewWeights in interface WeightUpdater
Overrides:
computeNewWeights in class GradientDescentUpdater

Parameters:
theta - the old weights.
gradient - the gradient.
learningRate - the learning rate.
iteration - the number of the current iteration.
lambda - the regularization parameter.
cost - the computed cost for this gradient update.

Copyright © 2015. All rights reserved.
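To illustrate the algorithm the class is named after, below is a minimal, self-contained sketch of one FTRL-Proximal update step with the same `beta`, `l1`, and `l2` parameters. All class and field names here (`FtrlSketch`, `z`, `n`, `update`) are illustrative assumptions, not the library's internals; the actual `computeNewWeights` implementation may differ in detail.

```java
// Hedged sketch of a per-coordinate FTRL-Proximal weight update.
// beta smooths the adaptive learning rate; l1 drives weights to exact
// zero (sparsity); l2 shrinks the remaining weights. Names are
// illustrative, not taken from de.jungblut.math.
public class FtrlSketch {
    // Per-coordinate accumulators carried between iterations.
    private final double[] z; // lazily accumulated (shifted) weights
    private final double[] n; // sum of squared gradients per coordinate

    private final double beta, l1, l2, alpha;

    public FtrlSketch(double beta, double l1, double l2,
                      double alpha /* learning rate */, int dim) {
        this.beta = beta;
        this.l1 = l1;
        this.l2 = l2;
        this.alpha = alpha;
        this.z = new double[dim];
        this.n = new double[dim];
    }

    /** Folds one gradient into the state and returns the new weights. */
    public double[] update(double[] theta, double[] gradient) {
        double[] w = new double[theta.length];
        for (int i = 0; i < theta.length; i++) {
            double g = gradient[i];
            // Adaptive per-coordinate step size from the gradient history.
            double sigma = (Math.sqrt(n[i] + g * g) - Math.sqrt(n[i])) / alpha;
            z[i] += g - sigma * theta[i];
            n[i] += g * g;
            if (Math.abs(z[i]) <= l1) {
                // l1 threshold not exceeded: weight is exactly zero.
                w[i] = 0.0;
            } else {
                w[i] = -(z[i] - Math.signum(z[i]) * l1)
                        / ((beta + Math.sqrt(n[i])) / alpha + l2);
            }
        }
        return w;
    }
}
```

For example, with `l1 = 1.0` a single small gradient of `0.5` leaves the weight at exactly `0.0`, while with `l1 = 0.1` the same gradient produces a small negative weight; this built-in sparsity is the main reason to prefer FTRL over the plain gradient descent step of `GradientDescentUpdater`.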