class ConstrainedFitnessAL:
Construct an unconstrained objective function from constraints.
This class constructs an unconstrained "fitness" function (to be minimized) from an objective function and an inequality constraints function (which returns a list of constraint values). An equality constraint h(x) == 0 must be expressed as two inequality constraints like [h(x) - eps, -h(x) - eps] with eps >= 0. Nonpositive constraint values (<= 0) are considered feasible.
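As a minimal sketch of the convention above, an equality constraint can be rewritten as two inequalities; `h` and `eps` here are hypothetical placeholders for a user-defined function and tolerance, not part of the class API:

```python
eps = 1e-9  # relaxation of the equality; eps = 0 gives an exact equality

def h(x):  # hypothetical equality constraint, h(x) == 0 when satisfied
    return x[0] + x[1] - 1

def constraints(x):
    # both values are <= 0 (feasible) if and only if |h(x)| <= eps
    return [h(x) - eps, -h(x) - eps]
```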
The update method of the class instance needs to be called after each iteration. Depending on the setting of which, update may call get_solution(es), which shall return the solution to be used for the constraints handling update, by default _get_favorite_solution == lambda es: es.ask(1, sigma_fac=0)[0]. The additional evaluation of objective and constraints is avoided by the default which='best', using the best solution of the current iteration.
find_feasible_first optimizes to get a feasible solution first, which only works if no equality constraints are implemented. For this reason the default is False.
Minimal example (verbosity set for doctesting):
>>> import cma
>>> def constraints(x):  # define the constraint
...     return [x[0] + 1, x[1]]  # shall be <= 0
>>> cfun = cma.ConstrainedFitnessAL(cma.ff.sphere, constraints,
...                                 find_feasible_first=True)
>>> es = cma.CMAEvolutionStrategy(3 * [1.1], 0.1,
...                               {'tolstagnation': 0, 'verbose': -9})  # verbosity for doctest only
>>> es = es.optimize(cfun, callback=cfun.update)
>>> x = es.result.xfavorite
The best x return value of cma.fmin2 may not be useful, because the underlying function changes over time. Therefore, we use es.result.xfavorite, which is still not guaranteed to be a feasible solution. Alternatively, cfun.best_feas.x contains the best evaluated feasible solution. However, this is not necessarily expected to be a good solution, see below.
>>> assert sum((x - [-1, 0, 0])**2) < 1e-9, x
>>> assert es.countevals < 2200, es.countevals
>>> assert cfun.best_feas.f < 10, str(cfun.best_feas)
>>> # print(es.countevals, cfun.best_feas.__dict__)
To find a final feasible solution (close to es.result.xfavorite) we can use the current CMAEvolutionStrategy instance es:
>>> x = cfun.find_feasible(es)  # uses es.optimize to find (another) feasible solution
>>> assert constraints(x)[0] <= 0, (x, cfun.best_feas.x)
>>> assert cfun.best_feas.f < 1 + 2e-6, str(cfun.best_feas)
>>> assert len(cfun.archives) == 3
>>> assert cma.ConstrainedFitnessAL(cma.ff.sphere, constraints,
...                                 archives=False).archives == []
Details: The fitness, to be minimized, is changing over time such that the overall minimal value does not indicate the best solution.
The construction is based on the AugmentedLagrangian class. If, as by default, self.finding_feasible is False, the fitness equals f(x) + sum_i (lam_i * g_i + mu_i * g_i^2 / 2), where g_i = max(g_i(x), -lam_i / mu_i) and lam_i and mu_i are generally positive and dynamically adapted coefficients. Only lam_i can change the position of the optimum in the feasible domain (and hence must converge to the right value).
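The penalty above can be sketched in plain Python. Note that this is an illustration of the formula only: the function name and signature are hypothetical, and lam and mu are passed in as fixed sequences here, whereas the class adapts them dynamically.

```python
def al_fitness(f, g, x, lam, mu):
    """Augmented Lagrangian fitness: f(x) + sum_i (lam_i * gi + mu_i * gi**2 / 2)
    with gi = max(g_i(x), -lam_i / mu_i)."""
    total = f(x)
    for gi, li, mi in zip(g(x), lam, mu):
        gi = max(gi, -li / mi)  # clip well-satisfied (inactive) constraints
        total += li * gi + mi * gi**2 / 2
    return total
```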
When self.finding_feasible is True, the fitness equals sum_i (g_i > 0) * g_i^2 and omits f + sum_i lam_i * g_i altogether. Whenever a feasible solution is found, the finding_feasible flag is reset to False.
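The feasibility-seeking fitness can be sketched as follows; the function name is hypothetical and only illustrates the formula, in which f(x) is never evaluated:

```python
def feasibility_fitness(g, x):
    """Sum of squared positive (violated) constraint values; equals 0
    if and only if x is feasible under the <= 0 convention."""
    return sum(gi**2 for gi in g(x) if gi > 0)
```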
find_feasible(es) sets finding_feasible = True and uses es.optimize to optimize self.__call__. This works well with CMAEvolutionStrategy but may easily fail with solvers that do not consistently pass over the optimum in search space but approach the optimum from one side only. This is not advisable if the feasible domain has zero volume, e.g. when g models an equality like g = lambda x: [h(x), -h(x)].
An equality constraint, h(x) == 0, cannot be handled like h**2 <= 0, because the Augmented Lagrangian requires the derivative at h == 0 to be nonzero. Using abs(h) <= 0 leads to divergence of the coefficient mu and of the condition number. The best way is apparently to use the two inequality constraints [h <= 0, -h <= 0], which seems to work perfectly well. The underlying AugmentedLagrangian class also accepts equality constraints.
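A small helper can wrap an equality into the recommended pair of inequalities; the helper name is hypothetical, not part of the class API:

```python
def as_two_inequalities(h):
    """Turn an equality constraint h(x) == 0 into the two inequality
    constraints [h(x) <= 0, -h(x) <= 0]."""
    return lambda x: [h(x), -h(x)]
```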
Methods:
__call__: return AL fitness, append f and g values to self.F and self.G.
__init__: constructor with lazy initialization.
find_feasible: find feasible solution by calling es.optimize(self).
initialize: set search space dimension explicitly.
log: a hack to have something in the cmalogger divers plot.
reset: reset dynamic components.
set: set AL coefficients.
update: update AL coefficients, may be used as callback to OOOptimizer.optimize.

Private methods:
_fg: f, g values used to update the Augmented Lagrangian coefficients.
_is: return True if last evaluated solution (or gvals) was feasible.
_reset: Undocumented
_reset: Undocumented
_reset: Undocumented
_update: keep track of best solution and best feasible solution.

Properties:
al: AugmentedLagrangian class instance.
_best: return current best f, g, where best is determined by the Augmented Lagrangian.

Class variable:
archive: Undocumented

Instance variables (all undocumented):
archives, best (three entries), constraints, count (two entries), dimension, F, find, finding_feasible, foffset, fun, G, get_solution, logging, omit, which, _al, _set.
return AL fitness, append f and g values to self.F and self.G.
If self.finding_feasible, fun(x) is not called and f = np.nan.
constructor with lazy initialization.
If which in ['mean', 'solution'], get_solution is called (with the argument passed to the update method) to determine the solution used to update the AL coefficients.
If find_feasible_first, only the constraints are optimized until the first (fully) feasible solution is found.
logging is the iteration gap for logging constraints related data in AugmentedLagrangian. 0 means no logging and negative values have unspecified behavior.
archives are the aggregator functions for constraints for nondominated biobjective archives. By default, the second objective is max(g_+), sum(g_+) or sum(g_+^2), respectively. archives=True invokes the same behavior. archives=False or an empty tuple prevents maintaining archives.
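The three default second objectives could look like the following sketch, where g_+ denotes the positive (violated) parts of the constraint values; the function names are hypothetical and the class's actual implementation may differ:

```python
def max_gpos(gvals):
    """Largest constraint violation, 0 if feasible."""
    return max([g for g in gvals if g > 0], default=0)

def sum_gpos(gvals):
    """Sum of constraint violations, 0 if feasible."""
    return sum(g for g in gvals if g > 0)

def sum_gpos_squared(gvals):
    """Sum of squared constraint violations, 0 if feasible."""
    return sum(g**2 for g in gvals if g > 0)
```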
find feasible solution by calling es.optimize(self).
Return the best ever feasible solution self.best_feas.x. See also self.best_feas.info.
aggregator, defaulting to self.find_feasible_aggregator, is the constraints aggregation function used as the objective function to be minimized. aggregator takes as input all constraint values and returns a value <= 0 if and only if the solution is feasible.
Terminate when either (another) feasible solution was found or any of the termination keys is matched in es.stop().
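A simple function satisfying the stated aggregator contract, returning a value <= 0 if and only if the solution is feasible, would be the maximum over all constraint values (the name here is illustrative):

```python
def max_aggregator(gvals):
    # <= 0 exactly when every constraint value is <= 0, i.e. feasible
    return max(gvals)
```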
a hack to have something in the cmalogger divers plot.
Append the sum of positive g-values and the number of infeasible constraints, displayed like 10**(number/10) (mapping [0, 10] to [1, 10]) if number < 10, to es.more_to_write.
update AL coefficients, may be used as callback to OOOptimizer.optimize.
TODO: decide what happens when __call__ was never called: ignore (as for now), or update based on xfavorite by calling self(xfavorite), assuming that update was called on purpose? When which is not 'best', it should work without even calling self(xfavorite).