Conservative Convex Separable Approximation with Affine Approximation plus Quadratic Penalty
This is a variant of CCSA ("conservative convex separable approximation") which, instead of constructing local MMA approximations, constructs simple quadratic approximations: affine approximations plus a quadratic penalty term that keeps them conservative.
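As a sketch of the idea (our notation, following Svanberg's CCSA framework; the per-coordinate scaling used inside NLopt may differ), the k-th outer iteration replaces the objective f by the model

\[
g^{(k)}(x) \;=\; f\bigl(x^{(k)}\bigr) + \nabla f\bigl(x^{(k)}\bigr)^{\top}\bigl(x - x^{(k)}\bigr) + \frac{\rho_k}{2}\,\bigl\lVert x - x^{(k)} \bigr\rVert^2,
\]

where the penalty weight \(\rho_k > 0\) is increased in the inner iterations until the model is conservative, i.e. it lies above f at the proposed step; the inequality constraints are approximated in the same way.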
ccsaq(x0, fn, gr = NULL, lower = NULL, upper = NULL, hin = NULL,
  hinjac = NULL, nl.info = FALSE, control = list(),
  deprecatedBehavior = TRUE, ...)
Arguments
x0: starting point for searching the optimum.
fn: objective function that is to be minimized.
gr: gradient of function fn; will be calculated numerically if not specified.
lower, upper: lower and upper bound constraints.
hin: function defining the inequality constraints, that is hin >= 0 for all components (but see deprecatedBehavior below).
hinjac: Jacobian of function hin; will be calculated numerically if not specified.
nl.info: logical; should the original NLopt info be shown.
control: list of options, see nl.opts for help.
deprecatedBehavior: logical; if TRUE (default for now), the old sign convention for the inequality constraints and their Jacobian is used, where feasibility means hin ≥ 0 instead of hin ≤ 0 (see the sign-convention sketch after this list). This will be reversed in a future release and the option eventually removed.
...: additional arguments passed to the function.
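As a minimal sketch of the two sign conventions, here is a hypothetical toy problem (not from the original documentation): minimizing a quadratic over the unit disc x1^2 + x2^2 <= 1.

library(nloptr)

fn.toy <- function(x) (x[1] - 2)^2 + (x[2] - 1)^2

## Old convention (deprecatedBehavior = TRUE): feasible points satisfy hin(x) >= 0.
hin.old <- function(x) 1 - x[1]^2 - x[2]^2

## New convention (deprecatedBehavior = FALSE): feasible points satisfy hin(x) <= 0.
hin.new <- function(x) x[1]^2 + x[2]^2 - 1

## Both calls describe the same feasible set; only the sign convention differs.
S1 <- ccsaq(c(0, 0), fn.toy, hin = hin.old, deprecatedBehavior = TRUE)
S2 <- ccsaq(c(0, 0), fn.toy, hin = hin.new, deprecatedBehavior = FALSE)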
Returns
List with components:
par: the optimal solution found so far.
value: the function value corresponding to par.
iter: number of (outer) iterations, see maxeval.
convergence: integer code indicating successful completion (> 0) or a possible error number (< 0).
message: character string produced by NLopt and giving additional information.
Note
"Globally convergent" does not mean that this algorithm converges to the global optimum; it means that it is guaranteed to converge to some local minimum from any feasible starting point.
Examples
## Solve the Hock-Schittkowski problem no. 100 with analytic gradients
## See https://apmonitor.com/wiki/uploads/Apps/hs100.apm
x0.hs100 <- c(1, 2, 0, 4, 0, 1, 1)

fn.hs100 <- function(x) {
  (x[1] - 10)^2 + 5 * (x[2] - 12)^2 + x[3]^4 + 3 * (x[4] - 11)^2 +
    10 * x[5]^6 + 7 * x[6]^2 + x[7]^4 - 4 * x[6] * x[7] - 10 * x[6] - 8 * x[7]
}

hin.hs100 <- function(x) {
  c(2 * x[1]^2 + 3 * x[2]^4 + x[3] + 4 * x[4]^2 + 5 * x[5] - 127,
    7 * x[1] + 3 * x[2] + 10 * x[3]^2 + x[4] - x[5] - 282,
    23 * x[1] + x[2]^2 + 6 * x[6]^2 - 8 * x[7] - 196,
    4 * x[1]^2 + x[2]^2 - 3 * x[1] * x[2] + 2 * x[3]^2 + 5 * x[6] - 11 * x[7])
}

gr.hs100 <- function(x) {
  c(2 * x[1] - 20,
    10 * x[2] - 120,
    4 * x[3]^3,
    6 * x[4] - 66,
    60 * x[5]^5,
    14 * x[6] - 4 * x[7] - 10,
    4 * x[7]^3 - 4 * x[6] - 8)
}

hinjac.hs100 <- function(x) {
  matrix(c(4 * x[1], 12 * x[2]^3, 1, 8 * x[4], 5, 0, 0,
           7, 3, 20 * x[3], 1, -1, 0, 0,
           23, 2 * x[2], 0, 0, 0, 12 * x[6], -8,
           8 * x[1] - 3 * x[2], 2 * x[2] - 3 * x[1], 4 * x[3], 0, 0, 5, -11),
         nrow = 4, byrow = TRUE)
}

## The optimum value of the objective function should be 680.6300573
## A suitable parameter vector is roughly
## (2.330, 1.9514, -0.4775, 4.3657, -0.6245, 1.0381, 1.5942)

# Results with exact Jacobian
S <- ccsaq(x0.hs100, fn.hs100, gr = gr.hs100, hin = hin.hs100,
           hinjac = hinjac.hs100, nl.info = TRUE,
           control = list(xtol_rel = 1e-8), deprecatedBehavior = FALSE)

# Results without Jacobian
S <- ccsaq(x0.hs100, fn.hs100, hin = hin.hs100, nl.info = TRUE,
           control = list(xtol_rel = 1e-8), deprecatedBehavior = FALSE)
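The returned object is a plain list; as a brief follow-up (component names as listed under Returns above), its entries can be inspected directly:

S$par          # best parameter vector found, roughly (2.330, 1.9514, ...)
S$value        # objective value at S$par, should be near 680.6300573
S$iter         # number of outer iterations
S$convergence  # integer status code
S$message      # status message produced by NLopt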
References
Krister Svanberg, "A class of globally convergent optimization methods based on conservative convex separable approximations," SIAM J. Optim. 12(2), pp. 555-573 (2002).