Class MultilayerPerceptron

    • Field Detail

      • SEED

        public static long SEED
    • Method Detail

      • predict

        public de.jungblut.math.DoubleVector predict(de.jungblut.math.DoubleVector xi)
        Predicts the outcome of the given input by doing a forward pass.
        Returns:
        the vector that contains an indicator at the index of the predicted class. Usually this is 0 or 1; in some cases it is a probability or activation value.
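        What the forward pass computes can be sketched in plain Java for a single sigmoid layer. This is illustrative only: the real predict operates on de.jungblut.math.DoubleVector and the network's trained weight matrices, and the class and method names below are hypothetical.

```java
public class ForwardPassSketch {

    // sigmoid activation, a common choice for this kind of network
    static double sigmoid(double x) {
        return 1.0 / (1.0 + Math.exp(-x));
    }

    // one layer: out[j] = sigmoid(sum_i weights[j][i] * input[i] + bias[j])
    static double[] forward(double[][] weights, double[] bias, double[] input) {
        double[] out = new double[weights.length];
        for (int j = 0; j < weights.length; j++) {
            double sum = bias[j];
            for (int i = 0; i < input.length; i++) {
                sum += weights[j][i] * input[i];
            }
            out[j] = sigmoid(sum);
        }
        return out;
    }

    public static void main(String[] args) {
        // weighted sum is 1*2 + (-1)*2 + 0 = 0, and sigmoid(0) = 0.5
        double[][] w = {{1.0, -1.0}};
        double[] b = {0.0};
        double[] activation = forward(w, b, new double[]{2.0, 2.0});
        System.out.println(activation[0]); // prints 0.5
    }
}
```

        The output of the last layer is exactly the activation value mentioned above; in a multi-layer network the output of one layer becomes the input of the next.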
      • predict

        public de.jungblut.math.DoubleVector predict(de.jungblut.math.DoubleVector xi,
                                                     double threshold)
        Predicts the outcome of the given input by doing a forward pass. Used for binary classification with a threshold: everything above the threshold is considered a 1, everything else a 0.
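        The thresholding step can be sketched as below; binarize is a hypothetical helper, not part of the API:

```java
public class ThresholdSketch {

    // Everything above the threshold becomes 1d, everything else 0d,
    // mirroring the binary-classification variant of predict().
    static double[] binarize(double[] activations, double threshold) {
        double[] out = new double[activations.length];
        for (int i = 0; i < activations.length; i++) {
            out[i] = activations[i] > threshold ? 1d : 0d;
        }
        return out;
    }

    public static void main(String[] args) {
        double[] r = binarize(new double[]{0.8, 0.3}, 0.5);
        System.out.println(r[0] + " " + r[1]); // prints 1.0 0.0
    }
}
```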
      • train

        public void train(de.jungblut.math.DoubleVector[] features,
                          de.jungblut.math.DoubleVector[] outcome)
        Description copied from interface: Classifier
        Trains this classifier with the given features and the outcome.
        Specified by:
        train in interface Classifier
        Overrides:
        train in class AbstractClassifier
        Parameters:
        outcome - the outcome must have classes labeled as doubles, e.g. in the binary case a single element that is either 0d or 1d. In higher-dimensional cases, each class is mapped to its own dimension of the outcome vector.
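        The label encoding described above can be sketched like this; oneHot is a hypothetical helper for the higher-dimensional (one-hot) case:

```java
public class OutcomeEncoding {

    // For k classes, each label maps to its own dimension of the outcome
    // vector; in the binary case a single element holds 0d or 1d instead.
    static double[] oneHot(int classIndex, int numClasses) {
        double[] v = new double[numClasses];
        v[classIndex] = 1d;
        return v;
    }

    public static void main(String[] args) {
        // class 2 of 4 classes
        double[] v = oneHot(2, 4);
        System.out.println(java.util.Arrays.toString(v)); // prints [0.0, 0.0, 1.0, 0.0]
    }
}
```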
      • train

        public final double train(de.jungblut.math.DoubleVector[] features,
                                  de.jungblut.math.DoubleVector[] outcome,
                                  Minimizer minimizer,
                                  int maxIterations,
                                  double lambda,
                                  boolean verbose)
        Full backpropagation training method. It finds the weights by using the given minimizer. Note that it is only guaranteed to find a global minimum for linear or convex problems (zero or one hidden layer), and even this depends on the concrete minimizer implementation. With more than a single hidden layer, it will usually get trapped in a local minimum. A theta vector can be supplied so training can be resumed from a good starting point.
        Parameters:
        features - the training examples.
        outcome - the outcomes for the training examples.
        minimizer - the minimizer to use to train the neural network.
        maxIterations - the maximum number of iterations to train for.
        lambda - the regularization parameter.
        verbose - if true, outputs the last errors to the console.
        theta - the initial point to start the minimization from.
        Returns:
        the cost of the training.
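        To illustrate the roles of maxIterations and lambda, here is a toy gradient-descent loop on a one-dimensional convex cost. It is a sketch of what a minimizer does during training, not the actual Minimizer interface; all names and the learning rate are assumptions.

```java
public class MinimizerSketch {

    // Minimizes cost(theta) = theta^2 with an L2 regularization term lambda,
    // running at most maxIterations steps; returns the final cost.
    static double minimize(double theta, double lambda, int maxIterations, boolean verbose) {
        double learningRate = 0.1;
        for (int i = 0; i < maxIterations; i++) {
            // gradient of theta^2 plus the regularization contribution
            double gradient = 2 * theta + lambda * theta;
            theta -= learningRate * gradient;
            if (verbose) {
                System.out.println("iteration " + i + ", theta = " + theta);
            }
        }
        return theta * theta;
    }

    public static void main(String[] args) {
        // a convex problem converges toward the global minimum at theta = 0
        double cost = minimize(1.0, 0.0, 100, false);
        System.out.println(cost < 1e-6); // prints true
    }
}
```

        On a convex cost like this one, any reasonable step size converges to the global minimum; the local-minimum caveat above only applies once multiple hidden layers make the cost surface non-convex.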
      • getFoldedThetaVector

        public de.jungblut.math.DoubleVector getFoldedThetaVector()
        Returns:
        the folded theta vector, built from the current weight matrices.
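        Folding can be sketched as flattening all weight matrices row by row into one vector, so a minimizer can treat the whole network as a single parameter vector; fold is a hypothetical helper, not the actual implementation:

```java
public class FoldThetaSketch {

    // Flattens each weight matrix row by row into a single vector.
    static double[] fold(double[][][] matrices) {
        int length = 0;
        for (double[][] m : matrices) {
            length += m.length * m[0].length;
        }
        double[] folded = new double[length];
        int idx = 0;
        for (double[][] m : matrices) {
            for (double[] row : m) {
                for (double v : row) {
                    folded[idx++] = v;
                }
            }
        }
        return folded;
    }

    public static void main(String[] args) {
        // a 2x2 matrix and a 1x2 matrix fold into a 6-element vector
        double[][] w1 = {{1, 2}, {3, 4}};
        double[][] w2 = {{5, 6}};
        System.out.println(java.util.Arrays.toString(fold(new double[][][]{w1, w2})));
        // prints [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
    }
}
```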
      • getLayers

        public int[] getLayers()
      • deserialize

        public static MultilayerPerceptron deserialize(java.io.DataInput in)
                                                throws java.io.IOException
        Deserializes a new neural network from the given input stream. Note that "in" will not be closed by this method.
        Throws:
        java.io.IOException
      • serialize

        public static void serialize(MultilayerPerceptron model,
                                     java.io.DataOutput out)
                              throws java.io.IOException
        Serializes the given network at its current state to the given binary output. Note that "out" will not be closed by this method.
        Throws:
        java.io.IOException
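        A round trip through java.io.DataOutput and java.io.DataInput can be sketched as below. The weight format shown is an assumption for illustration, not the network's actual binary layout; as with serialize and deserialize, the streams are left open for the caller.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInput;
import java.io.DataInputStream;
import java.io.DataOutput;
import java.io.DataOutputStream;
import java.io.IOException;

public class SerializationSketch {

    // Writes the length followed by each weight; "out" is not closed here.
    static void writeWeights(double[] weights, DataOutput out) throws IOException {
        out.writeInt(weights.length);
        for (double w : weights) {
            out.writeDouble(w);
        }
    }

    // Reads back the same format; "in" is not closed here either.
    static double[] readWeights(DataInput in) throws IOException {
        double[] weights = new double[in.readInt()];
        for (int i = 0; i < weights.length; i++) {
            weights[i] = in.readDouble();
        }
        return weights;
    }

    // Convenience round trip through an in-memory buffer.
    static double[] roundTrip(double[] weights) {
        try {
            ByteArrayOutputStream buffer = new ByteArrayOutputStream();
            writeWeights(weights, new DataOutputStream(buffer));
            return readWeights(new DataInputStream(
                    new ByteArrayInputStream(buffer.toByteArray())));
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        double[] back = roundTrip(new double[]{0.5, -1.25});
        System.out.println(back[0] + " " + back[1]); // prints 0.5 -1.25
    }
}
```

        Using DataInput/DataOutput rather than concrete streams is what lets the caller decide where the bytes go (a file, a socket, an in-memory buffer) and when the underlying resource is closed.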