A wrapper function that attempts to check the objective function, and optionally the gradient and Hessian functions, supplied by the user for an optimization problem. It also tries to check whether the scale of the parameters and bounds is reasonable.
par: a vector of initial values for the parameters for which optimal values are to be found. Names on the elements of this vector are preserved and used in the results data frame.
fn: A function to be minimized (or maximized), with first argument the vector of parameters over which minimization is to take place. It should return a scalar result.
gr: A function to return (as a vector) the gradient for those methods that can use this information.
hess: A function to return (as a symmetric matrix) the Hessian of the objective function for those methods that can use this information.
lower, upper: Bounds on the variables for methods such as "L-BFGS-B" that can handle box (or bounds) constraints.
control: A list of control parameters. See Details.
...: For optimx, further arguments to be passed to fn and gr; otherwise, further arguments are not used.
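This page does not display the wrapper's name, so the sketch below assumes it is the optchk() function from optimx; the call form simply mirrors the argument list above, and the Rosenbrock test function with its analytic gradient and Hessian is supplied purely as an illustration.

```r
library(optimx)

## Rosenbrock banana function with analytic gradient and Hessian
fr <- function(x) 100 * (x[2] - x[1]^2)^2 + (1 - x[1])^2
grr <- function(x) c(-400 * x[1] * (x[2] - x[1]^2) - 2 * (1 - x[1]),
                      200 * (x[2] - x[1]^2))
hrr <- function(x) matrix(c(2 - 400 * (x[2] - 3 * x[1]^2), -400 * x[1],
                            -400 * x[1],                    200),
                          nrow = 2, byrow = TRUE)

par0 <- c(b1 = -1.2, b2 = 1)   # names on par are preserved in the results

## Assumed call form, mirroring the arguments documented above
chk <- optchk(par = par0, fn = fr, gr = grr, hess = hrr,
              lower = -Inf, upper = Inf, control = list())
str(chk)   # expect grOK, hessOK, scalebad, scaleratios
```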
Details
Note that arguments after ... must be matched exactly.
While it can be envisaged that a user would have an analytic Hessian but not an analytic gradient, we do NOT permit the user to test the Hessian in this situation, because the Hessian check differentiates the analytic gradient (see hessOK under Returns).
Any names given to par will be copied to the vectors passed to fn and gr. Note that no other attributes of par are copied over. (We have not verified this as at 2009-07-29.)
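As a small illustration of why the preserved names can matter, an objective or gradient may index the parameter vector by name rather than by position. The function and values here are invented for this sketch:

```r
## Hypothetical objective and gradient that index parameters by name
fq <- function(p) (p[["a"]] - 3)^2 + 10 * (p[["b"]] + 1)^2
gq <- function(p) c(2 * (p[["a"]] - 3), 20 * (p[["b"]] + 1))

p0 <- c(a = 0, b = 0)
fq(p0)    # works because the names on p0 reach fq unchanged
gq(p0)
```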
Returns
A list of the following items:
grOK: TRUE if the analytic gradient and a numerical approximation computed via numDeriv agree within the tolerance `control$grtesttol`, as per the `R` code in function `grchk`. `NULL` if no analytic gradient function is provided.
hessOK: TRUE if the analytic Hessian and a numerical approximation computed via numDeriv::jacobian agree within the tolerance `control$hesstesttol`, as per the `R` code in function `hesschk`. `NULL` if no analytic Hessian or no analytic gradient is provided. Note that because an analytic gradient must be available for this test, we compute the numerical Hessian as the Jacobian of that gradient, avoiding one level of differencing, though the `hesschk` function can work without the gradient.
scalebad: TRUE if the larger of the scale ratios exceeds `control$scaletol`.
scaleratios: A vector of the parameter and bounds scale ratios. See the code of function `scalechk` for how these values are computed.
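The comparisons behind grOK and hessOK can be imitated directly with numDeriv. The sketch below uses an invented test problem, a relative-difference test, and an assumed tolerance value; the authoritative computations and default tolerances are those in the `grchk` and `hesschk` code and the package's control list.

```r
library(numDeriv)

## Invented test problem with analytic gradient and Hessian
fn   <- function(x) sum((x - c(1, 2))^2) + x[1] * x[2]
gr   <- function(x) c(2 * (x[1] - 1) + x[2], 2 * (x[2] - 2) + x[1])
hess <- function(x) matrix(c(2, 1, 1, 2), nrow = 2)
x0   <- c(0.5, 0.5)

## grchk-style test: analytic gradient vs. numerical approximation
ga <- gr(x0)
gn <- numDeriv::grad(fn, x0)
grtesttol <- (.Machine$double.eps)^(1/3)      # assumed tolerance value
grOK <- max(abs(gn - ga)) / max(abs(ga), 1) <= grtesttol

## hesschk-style test: numerical Hessian taken as the Jacobian of the
## analytic gradient, so only one level of differencing is introduced
ha <- hess(x0)
hn <- numDeriv::jacobian(gr, x0)
hesstesttol <- (.Machine$double.eps)^(1/3)    # assumed tolerance value
hessOK <- max(abs(hn - ha)) / max(abs(ha), 1) <= hesstesttol

c(grOK = grOK, hessOK = hessOK)
```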
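Similarly, the idea behind scaleratios and scalebad can be sketched as a comparison of the spread of magnitudes of the parameters and of the finite, nonzero bounds. The exact formula and the default `control$scaletol` are those in the `scalechk` code, not the illustrative values used here.

```r
## Spread of magnitudes, in log10 units, of the finite nonzero entries
magspread <- function(v) {
  v <- abs(v[is.finite(v) & v != 0])
  if (length(v) == 0) return(0)
  max(log10(v)) - min(log10(v))
}

par1   <- c(1e-5, 2, 3e4)
lower1 <- c(0, 0, 0)
upper1 <- c(1, 10, 1e6)

sratios  <- c(par = magspread(par1), bounds = magspread(c(lower1, upper1)))
scaletol <- 3                     # assumed tolerance value
scalebad <- max(sratios) > scaletol
sratios
scalebad
```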
References
See the manual pages for optim() and for the packages suggested in the DESCRIPTION file.
Nash JC and Varadhan R (2011). Unifying Optimization Algorithms to Aid Software System Users: optimx for R. Journal of Statistical Software, 43(9), 1-14. URL http://www.jstatsoft.org/v43/i09/.
Nash JC (2014). On Best Practice Optimization Methods in R. Journal of Statistical Software, 60(2), 1-14. URL http://www.jstatsoft.org/v60/i02/.