public class NonLinearConjugateGradientOptimizer extends AbstractScalarDifferentiableOptimizer
This class supports both the Fletcher-Reeves and the Polak-Ribière update formulas for the conjugate search directions. It also supports optional preconditioning.
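The two update formulas differ only in how the coefficient β that mixes the previous search direction into the new one is computed from the current gradient g and the previous gradient gPrev. A plain-Java sketch (independent of the library, class and method names are illustrative):

```java
public class UpdateFormulas {

    static double dot(double[] a, double[] b) {
        double s = 0;
        for (int i = 0; i < a.length; i++) {
            s += a[i] * b[i];
        }
        return s;
    }

    /** Fletcher-Reeves: beta = (g . g) / (gPrev . gPrev). */
    static double fletcherReeves(double[] g, double[] gPrev) {
        return dot(g, g) / dot(gPrev, gPrev);
    }

    /** Polak-Ribiere: beta = (g . (g - gPrev)) / (gPrev . gPrev). */
    static double polakRibiere(double[] g, double[] gPrev) {
        double num = 0;
        for (int i = 0; i < g.length; i++) {
            num += g[i] * (g[i] - gPrev[i]);
        }
        return num / dot(gPrev, gPrev);
    }

    public static void main(String[] args) {
        double[] gPrev = {3, 4};  // |gPrev|^2 = 25
        double[] g = {1, 2};      // |g|^2 = 5
        System.out.println(fletcherReeves(g, gPrev)); // 0.2
        System.out.println(polakRibiere(g, gPrev));   // -0.24
    }
}
```

Polak-Ribière often recovers better from poor directions because β can become negative (as above), which effectively restarts the iteration along the steepest descent direction.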
|Constructor|Description|
|---|---|
|NonLinearConjugateGradientOptimizer(ConjugateGradientFormula updateFormula)|Simple constructor with default settings.|
|Modifier and Type|Method and Description|
|---|---|
|void|setInitialStep(double initialStep): Set the initial step used to bracket the optimum in line search.|
|void|setLineSearchSolver(UnivariateRealSolver lineSearchSolver): Set the solver to use during line search.|
|void|setPreconditioner(Preconditioner preconditioner): Set the preconditioner.|
Methods inherited from class AbstractScalarDifferentiableOptimizer:
getConvergenceChecker, getEvaluations, getGradientEvaluations, getIterations, getMaxEvaluations, getMaxIterations, optimize, setConvergenceChecker, setMaxEvaluations, setMaxIterations
public NonLinearConjugateGradientOptimizer(ConjugateGradientFormula updateFormula)
The convergence check is set to a default convergence checker and the maximal number of iterations is set to its default value.
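As a usage sketch (hedged: it assumes the Commons Math 2.x package layout and the three-argument `optimize` inherited from the abstract base class; the objective function here is an illustrative example, not part of the library), the following minimizes a simple quadratic with the Polak-Ribière formula:

```java
import org.apache.commons.math.analysis.DifferentiableMultivariateRealFunction;
import org.apache.commons.math.analysis.MultivariateRealFunction;
import org.apache.commons.math.analysis.MultivariateVectorialFunction;
import org.apache.commons.math.optimization.GoalType;
import org.apache.commons.math.optimization.RealPointValuePair;
import org.apache.commons.math.optimization.general.ConjugateGradientFormula;
import org.apache.commons.math.optimization.general.NonLinearConjugateGradientOptimizer;

public class CgExample {

    // f(x, y) = (x - 1)^2 + (y - 2)^2, minimum at (1, 2)
    private static final DifferentiableMultivariateRealFunction F =
        new DifferentiableMultivariateRealFunction() {
            public double value(double[] p) {
                return (p[0] - 1) * (p[0] - 1) + (p[1] - 2) * (p[1] - 2);
            }
            public MultivariateVectorialFunction gradient() {
                return new MultivariateVectorialFunction() {
                    public double[] value(double[] p) {
                        return new double[] { 2 * (p[0] - 1), 2 * (p[1] - 2) };
                    }
                };
            }
            public MultivariateRealFunction partialDerivative(final int k) {
                return new MultivariateRealFunction() {
                    public double value(double[] p) {
                        return k == 0 ? 2 * (p[0] - 1) : 2 * (p[1] - 2);
                    }
                };
            }
        };

    public static void main(String[] args) throws Exception {
        NonLinearConjugateGradientOptimizer optimizer =
            new NonLinearConjugateGradientOptimizer(ConjugateGradientFormula.POLAK_RIBIERE);
        RealPointValuePair result =
            optimizer.optimize(F, GoalType.MINIMIZE, new double[] { 0, 0 });
        System.out.println(result.getPoint()[0] + " " + result.getPoint()[1]);
    }
}
```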
public void setPreconditioner(Preconditioner preconditioner)
Parameters:
preconditioner - preconditioner to use for the next optimization; may be null to remove an already registered preconditioner
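To illustrate what preconditioning buys (a plain-Java sketch, independent of the library's Preconditioner interface): a diagonal preconditioner rescales the steepest-descent residual r by an approximate inverse of the Hessian diagonal, which can markedly improve convergence on badly scaled problems.

```java
public class DiagonalPreconditioning {

    // Rescale the residual r componentwise by the (assumed positive)
    // diagonal of a Hessian approximation.
    static double[] precondition(double[] hessDiag, double[] r) {
        double[] out = new double[r.length];
        for (int i = 0; i < r.length; i++) {
            out[i] = r[i] / hessDiag[i];
        }
        return out;
    }

    public static void main(String[] args) {
        // f(x, y) = 50 x^2 + 0.5 y^2 has Hessian diagonal {100, 1};
        // the raw gradient at (1, 1) is {100, 1}: badly scaled.
        double[] d = precondition(new double[]{100, 1}, new double[]{100, 1});
        System.out.println(d[0] + " " + d[1]); // 1.0 1.0 : well scaled
    }
}
```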
public void setLineSearchSolver(UnivariateRealSolver lineSearchSolver)
Parameters:
lineSearchSolver - solver to use during line search; may be null to remove an already registered solver and fall back to the default
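The quantity a line-search solver is asked to drive to zero is the directional derivative φ'(t) = ∇f(p + t·d)·d along the current search direction d. A minimal plain-Java sketch using bisection in place of a library solver (names are illustrative):

```java
public class LineSearchSketch {

    interface Deriv { double apply(double t); }

    // Bisection on the directional derivative phi'(t): the root of
    // phi'(t) in [lo, hi] is the line-search optimum along d.
    static double bisect(Deriv phi, double lo, double hi, double tol) {
        while (hi - lo > tol) {
            double mid = 0.5 * (lo + hi);
            if (phi.apply(lo) * phi.apply(mid) <= 0) {
                hi = mid;   // sign change in the left half
            } else {
                lo = mid;   // sign change in the right half
            }
        }
        return 0.5 * (lo + hi);
    }

    public static void main(String[] args) {
        // phi(t) = (t - 2)^2 along the direction, so phi'(t) = 2 (t - 2)
        double t = bisect(x -> 2 * (x - 2), 0, 5, 1e-10);
        System.out.println(t); // approximately 2.0
    }
}
```

A tighter solver accuracy gives more exactly conjugate directions at the price of more function evaluations per iteration.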
public void setInitialStep(double initialStep)
The initial step is a factor with respect to the search direction, which itself is roughly related to the gradient of the function.
Parameters:
initialStep - initial step used to bracket the optimum in line search. If a non-positive value is used, the initial step is reset to its default value of 1.0.
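Bracketing from the initial step can be pictured as geometric expansion along the search direction until the function stops decreasing, which pins the optimum inside an interval. A plain-Java sketch (illustrative, not the library's internal routine):

```java
public class BracketingSketch {

    interface Fn { double apply(double t); }

    // Expand from initialStep until phi stops decreasing; the returned
    // interval [a, b] then brackets a minimum along the search direction.
    static double[] bracket(Fn phi, double initialStep) {
        double a = 0, step = initialStep;
        double fa = phi.apply(a);
        double b = step;
        double fb = phi.apply(b);
        while (fb < fa) {
            a = b;
            fa = fb;
            step *= 2;       // geometric expansion
            b = a + step;
            fb = phi.apply(b);
        }
        return new double[] { a, b };
    }

    public static void main(String[] args) {
        // phi(t) = (t - 10)^2 has its minimum at t = 10
        double[] br = bracket(t -> (t - 10) * (t - 10), 1.0);
        System.out.println(br[0] + " " + br[1]); // 7.0 15.0 (brackets t = 10)
    }
}
```

This is why the scale of initialStep matters: a step far smaller than the distance to the optimum costs extra expansion rounds, while a far larger one produces a loose bracket for the solver to refine.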