public class CMAESOptimizer extends MultivariateOptimizer
An implementation of the active Covariance Matrix Adaptation Evolution Strategy (CMA-ES) for non-linear, non-convex, non-smooth, global function minimization. The CMA Evolution Strategy (CMA-ES) is a reliable stochastic optimization method which should be applied if derivative-based methods, e.g. quasi-Newton BFGS or conjugate gradient, fail due to a rugged search landscape (e.g. noise, local optima, outliers, etc.) of the objective function. Like a quasi-Newton method, the CMA-ES learns and applies a variable metric on the underlying search space. Unlike a quasi-Newton method, the CMA-ES neither estimates nor uses gradients, making it considerably more reliable in terms of finding a good, or even close to optimal, solution.
In general, on smooth objective functions the CMA-ES is roughly ten times slower than BFGS (counting objective function evaluations, no gradients provided). For up to N=10 variables, the derivative-free simplex direct search method (Nelder and Mead) can also be faster, but it is far less reliable than CMA-ES.
The CMA-ES is particularly well suited for non-separable and/or badly conditioned problems. To observe the advantage of CMA compared to a conventional evolution strategy, it will usually take about 30 N function evaluations. On difficult problems the complete optimization (a single run) is expected to take roughly between 30 N and 300 N² function evaluations.
This implementation is translated and adapted from the Matlab version of the CMA-ES algorithm, as implemented in module `cmaes.m`, version 3.51.
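As a usage sketch, the following minimizes a simple sphere function. The package names assume the Apache Commons Math 3 layout (`org.apache.commons.math3.optim...`); the CNES distribution may place these classes in different packages, so adjust the imports accordingly. All parameter values below (seed, bounds, sigma, population size) are illustrative choices, not defaults mandated by the class.

```java
import java.util.Arrays;
import org.apache.commons.math3.optim.InitialGuess;
import org.apache.commons.math3.optim.MaxEval;
import org.apache.commons.math3.optim.PointValuePair;
import org.apache.commons.math3.optim.SimpleBounds;
import org.apache.commons.math3.optim.SimpleValueChecker;
import org.apache.commons.math3.optim.nonlinear.scalar.GoalType;
import org.apache.commons.math3.optim.nonlinear.scalar.ObjectiveFunction;
import org.apache.commons.math3.optim.nonlinear.scalar.noderiv.CMAESOptimizer;
import org.apache.commons.math3.random.MersenneTwister;

public class CmaesSphereDemo {
    public static double run() {
        int dim = 5;
        // Sphere function f(x) = sum x_i^2, global minimum 0 at the origin.
        ObjectiveFunction fitness = new ObjectiveFunction(x -> {
            double s = 0;
            for (double xi : x) s += xi * xi;
            return s;
        });

        CMAESOptimizer optimizer = new CMAESOptimizer(
                30000,                    // maxIterationsIn
                1e-13,                    // stopFitnessIn
                true,                     // isActiveCMAIn: active covariance update
                0,                        // diagonalOnlyIn: full covariance from the start
                0,                        // checkFeasableCountIn
                new MersenneTwister(42),  // randomIn (fixed seed for reproducibility)
                false,                    // generateStatisticsIn
                new SimpleValueChecker(1e-9, 1e-11));

        double[] start = new double[dim];
        Arrays.fill(start, 2.0);
        double[] sigma = new double[dim];
        Arrays.fill(sigma, 0.5);          // initial step size per coordinate
        double[] lo = new double[dim], hi = new double[dim];
        Arrays.fill(lo, -10.0);
        Arrays.fill(hi, 10.0);

        PointValuePair best = optimizer.optimize(
                new MaxEval(100000),
                fitness,
                GoalType.MINIMIZE,
                new InitialGuess(start),
                new SimpleBounds(lo, hi),
                new CMAESOptimizer.Sigma(sigma),
                new CMAESOptimizer.PopulationSize(4 + (int) (3 * Math.log(dim))));
        return best.getValue();
    }

    public static void main(String[] args) {
        System.out.println("f(x*) = " + run());
    }
}
```

The population size 4 + 3 ln(N) used here is the conventional CMA-ES default; larger populations trade speed for robustness on multimodal problems.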
Modifier and Type | Class and Description
---|---
`static class` | `CMAESOptimizer.PopulationSize`: Population size.
`static class` | `CMAESOptimizer.Sigma`: Input sigma values.
Fields inherited from class `BaseOptimizer`: `evaluations`
Constructor and Description
---
`CMAESOptimizer(int maxIterationsIn, double stopFitnessIn, boolean isActiveCMAIn, int diagonalOnlyIn, int checkFeasableCountIn, RandomGenerator randomIn, boolean generateStatisticsIn, ConvergenceChecker<PointValuePair> checker)`
Modifier and Type | Method and Description
---|---
`protected PointValuePair` | `doOptimize()`: Performs the bulk of the optimization algorithm.
`List<RealMatrix>` | `getStatisticsDHistory()`
`List<Double>` | `getStatisticsFitnessHistory()`
`List<RealMatrix>` | `getStatisticsMeanHistory()`
`List<Double>` | `getStatisticsSigmaHistory()`
`PointValuePair` | `optimize(OptimizationData... optData)`: Stores data and performs the optimization.
Methods inherited from class `MultivariateOptimizer`: `computeObjectiveValue`, `getGoalType`

Methods inherited from class `BaseMultivariateOptimizer`: `getLowerBound`, `getStartPoint`, `getUpperBound`

Methods inherited from class `BaseOptimizer`: `getConvergenceChecker`, `getEvaluations`, `getIterations`, `getMaxEvaluations`, `getMaxIterations`, `incrementEvaluationCount`, `incrementIterationCount`
`public CMAESOptimizer(int maxIterationsIn, double stopFitnessIn, boolean isActiveCMAIn, int diagonalOnlyIn, int checkFeasableCountIn, RandomGenerator randomIn, boolean generateStatisticsIn, ConvergenceChecker<PointValuePair> checker)`

Parameters:

- `maxIterationsIn`: Maximal number of iterations.
- `stopFitnessIn`: Stop if the objective function value is smaller than `stopFitness`.
- `isActiveCMAIn`: Chooses the covariance matrix update method.
- `diagonalOnlyIn`: Number of initial iterations during which the covariance matrix remains diagonal.
- `checkFeasableCountIn`: Determines how often new random objective variables are generated in case they are out of bounds.
- `randomIn`: Random generator.
- `generateStatisticsIn`: Whether statistic data is collected.
- `checker`: Convergence checker.

`public List<Double> getStatisticsSigmaHistory()`
`public List<RealMatrix> getStatisticsMeanHistory()`

`public List<Double> getStatisticsFitnessHistory()`

`public List<RealMatrix> getStatisticsDHistory()`
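When the optimizer is constructed with `generateStatisticsIn` set to `true`, these four getters expose per-iteration history after a run: the step size sigma, the best fitness, the distribution mean, and the scaling matrix D. A minimal sketch follows; package names assume the Apache Commons Math 3 layout (`org.apache.commons.math3...`) and may differ in the CNES distribution, and the problem and parameter values are illustrative.

```java
import java.util.List;
import org.apache.commons.math3.analysis.MultivariateFunction;
import org.apache.commons.math3.linear.RealMatrix;
import org.apache.commons.math3.optim.InitialGuess;
import org.apache.commons.math3.optim.MaxEval;
import org.apache.commons.math3.optim.SimpleBounds;
import org.apache.commons.math3.optim.SimpleValueChecker;
import org.apache.commons.math3.optim.nonlinear.scalar.GoalType;
import org.apache.commons.math3.optim.nonlinear.scalar.ObjectiveFunction;
import org.apache.commons.math3.optim.nonlinear.scalar.noderiv.CMAESOptimizer;
import org.apache.commons.math3.random.MersenneTwister;

public class CmaesStatisticsDemo {
    public static int run() {
        CMAESOptimizer opt = new CMAESOptimizer(
                500, 1e-12, true, 0, 0,
                new MersenneTwister(7),
                true,  // generateStatisticsIn: record per-iteration history
                new SimpleValueChecker(1e-8, 1e-10));

        // Simple 2-D convex quadratic as a test problem.
        MultivariateFunction quad = x -> x[0] * x[0] + 4 * x[1] * x[1];
        opt.optimize(
                new MaxEval(20000),
                new ObjectiveFunction(quad),
                GoalType.MINIMIZE,
                new InitialGuess(new double[] {1.5, -1.5}),
                new SimpleBounds(new double[] {-5, -5}, new double[] {5, 5}),
                new CMAESOptimizer.Sigma(new double[] {0.5, 0.5}),
                new CMAESOptimizer.PopulationSize(8));

        List<Double> sigmaHist = opt.getStatisticsSigmaHistory();    // step size per iteration
        List<Double> fitHist = opt.getStatisticsFitnessHistory();    // best fitness per iteration
        List<RealMatrix> meanHist = opt.getStatisticsMeanHistory();  // distribution mean per iteration
        List<RealMatrix> dHist = opt.getStatisticsDHistory();        // scaling diagonal per iteration
        System.out.println("iterations recorded: " + sigmaHist.size());
        return sigmaHist.size();
    }

    public static void main(String[] args) {
        run();
    }
}
```

A shrinking sigma history together with a decreasing fitness history is the usual signature of healthy convergence; a sigma history that stalls at a large value suggests the initial `Sigma` or the bounds are poorly scaled.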
`public PointValuePair optimize(OptimizationData... optData)`

Stores data and performs the optimization.

Overrides: `optimize` in class `MultivariateOptimizer`

Parameters:

- `optData`: Optimization data. The following data will be looked for:

Throws:

- `TooManyEvaluationsException`: if the maximal number of evaluations is exceeded.
- `DimensionMismatchException`: if the initial guess, target, and weight arguments have inconsistent dimensions.

`protected PointValuePair doOptimize()`

Performs the bulk of the optimization algorithm.

Overrides: `doOptimize` in class `BaseOptimizer<PointValuePair>`
Copyright © 2019 CNES. All rights reserved.