User-friendly wrapper of the function GParetoptim. Generates initial DOEs and kriging models (objects of class km), and executes nsteps iterations of multiobjective EGO methods.
easyGParetoptim(fn, ..., cheapfn = NULL, budget, lower, upper, par = NULL, value = NULL, noise.var = NULL, control = list(method = "SMS", trace = 1, inneroptim = "pso", maxit = 100, seed = 42), ncores = 1)
Arguments
fn: the multi-objective function to be minimized (vectorial output), found by a call to match.fun; see Details,
...: additional parameters to be given to the objective fn.
cheapfn: optional additional fast-to-evaluate objective function (handled internally with the class fastfun), which does not need a kriging model; also found by a call to match.fun,
budget: total number of calls to the objective function,
lower: vector of lower bounds for the variables to be optimized over,
upper: vector of upper bounds for the variables to be optimized over,
par: initial design of experiments. If not provided, par is taken as a maximin LHD with budget/3 points,
value: initial set of objective observations fn(par). Computed if not provided. Note that value must NOT contain any cheapfn values,
noise.var: optional noise variance, for noisy objectives fn. If not NULL, either a scalar (constant noise, identical for all objectives), a vector (constant noise, different for each objective) or a function (type closure) with vectorial output (variable noise, different for each objective). Alternatively, set noise.var="given_by_fn", see details.
control: an optional list of control parameters. See "Details",
ncores: number of CPUs available (a value > 1 enables parallel evaluation). Only used with discrete inner optimization for now.
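A minimal call might look as follows. This sketch assumes the GPareto package is attached and uses P1, a bi-objective test problem shipped with the package; any function with vectorial output would work in its place:

```r
library(GPareto)

set.seed(25468)
lower <- c(0, 0)
upper <- c(1, 1)

## 15 evaluations in total: roughly budget/3 for the initial maximin LHD,
## the rest spent on EGO iterations with the default "SMS" criterion
res <- easyGParetoptim(fn = P1, budget = 15, lower = lower, upper = upper)
```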
Returns
A list with components:
par: all the non-dominated points found,
value: the matrix of objective values at the points given in par,
history: a list containing all the points visited by the algorithm (X) and their corresponding objectives (y),
model: a list of objects of class km, corresponding to the last kriging models fitted.
Note that in the case of noisy problems, value and history$y.denoised are denoised values. The original observations are available in the slot history$y.
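The components above can be inspected directly; a short sketch, assuming res holds the output of a previous easyGParetoptim call:

```r
res$par          # non-dominated designs found
res$value        # matrix of objective values at res$par
str(res$history$X)  # all designs visited by the algorithm
str(res$history$y)  # raw observations (for noisy problems, the
                    # denoised values are in res$history$y.denoised)
```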
Details
Does not require specific knowledge of kriging models (objects of class km).
The problem considered is of the form: min f(x) = (f1(x), ..., fp(x)). The control argument is a list that can supply any of the following optional components:
method: choice of multiobjective improvement function: "SMS", "EHI", "EMI" or "SUR" (see crit_SMS, crit_EHI, crit_EMI, crit_SUR),
trace: if positive, tracing information on the progress of the optimization is produced (1 (default) for general progress, > 1 for more details, e.g., warnings from genoud),
inneroptim: choice of the inner optimization algorithm: "genoud", "pso" or "random" (see genoud and psoptim),
maxit: maximum number of iterations of the inner loop,
seed: seed used to fix the random number generator,
refPoint: reference point for hypervolume computations (for "SMS" and "EHI" methods),
extendper: if no reference point refPoint is provided, it is fixed, for each objective, to the maximum over the current Pareto front plus extendper times the objective range. Defaults to 0.2, corresponding to a reference point at 1.1 for a scaled objective with a Pareto front in [0,1]^n.obj.
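The control components above are passed as a plain list; a sketch using the option names documented here (P1 is a bi-objective test problem shipped with GPareto, and the refPoint value is purely illustrative):

```r
library(GPareto)

ctrl <- list(method = "EHI",        # expected hypervolume improvement
             inneroptim = "genoud", # inner optimizer (requires rgenoud)
             maxit = 200,           # inner-loop iteration cap
             refPoint = c(2, 2),    # reference point for hypervolume
             trace = 2)             # verbose progress

res <- easyGParetoptim(fn = P1, budget = 20, lower = c(0, 0),
                       upper = c(1, 1), control = ctrl)
```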
If noise.var="given_by_fn", fn must return a list of two vectors, the first being the objective functions and the second the corresponding noise variances. See examples in GParetoptim.
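A sketch of an objective in "given_by_fn" form; noisy_fn is a hypothetical wrapper (here built around the P1 test problem from GPareto) returning the list of noisy objective values and per-objective noise variances described above:

```r
library(GPareto)

## Hypothetical noisy bi-objective: element 1 = objective values,
## element 2 = the corresponding noise variances
noisy_fn <- function(x) {
  noise <- c(0.05, 0.1)                       # per-objective variances (illustrative)
  obs <- P1(x) + rnorm(2, sd = sqrt(noise))   # noisy observations
  list(obs, noise)
}

res <- easyGParetoptim(fn = noisy_fn, budget = 20, lower = c(0, 0),
                       upper = c(1, 1), noise.var = "given_by_fn")
```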
For additional details or other possible arguments, see GParetoptim.
Display of results and various post-processing options are available with plotGPareto.
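A minimal post-processing sketch, assuming res holds the output of a previous easyGParetoptim call (the UQ_PF argument is taken from the plotGPareto documentation):

```r
plotGPareto(res)                # Pareto front and set found
plotGPareto(res, UQ_PF = TRUE)  # uncertainty quantification on the front
```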
References
M. T. Emmerich, A. H. Deutz, J. W. Klinkenberg (2011), Hypervolume-based expected improvement: Monotonicity properties and exact computation, Evolutionary Computation (CEC), 2147-2154.
V. Picheny (2015), Multiobjective optimization using Gaussian process emulators via stepwise uncertainty reduction, Statistics and Computing, 25(6), 1265-1280.
T. Wagner, M. Emmerich, A. Deutz, W. Ponweiser (2010), On expected-improvement criteria for model-based multi-objective optimization. Parallel Problem Solving from Nature, 718-727, Springer, Berlin.
J. D. Svenson (2011), Computer Experiments: Multiobjective Optimization and Sensitivity Analysis, Ohio State university, PhD thesis.
M. Binois, V. Picheny (2019), GPareto: An R Package for Gaussian-Process-Based Multi-Objective Optimization and Analysis, Journal of Statistical Software, 89(8), 1-30, doi:10.18637/jss.v089.i08.