Derivative-Based Optimization with User-Defined Convergence Criteria
Broyden-Fletcher-Goldfarb-Shanno (BFGS) Optimization
Davidon-Fletcher-Powell (DFP) Quasi-Newton Optimization
Dogleg Trust-Region Optimization
Double Dogleg Trust-Region Optimization
Fast Numerical Gradient
Fast Numerical Hessian
Fast Numerical Jacobian
Gauss-Newton Optimization
Fast Positive Definiteness Check
Limited-memory BFGS with Box Constraints (L-BFGS-B)
Modified Newton-Raphson Optimization
Pure Newton-Raphson Optimization
Provides a derivative-based optimization framework that lets users combine eight convergence criteria. Unlike standard optimization routines, the package includes a built-in check of the positive definiteness of the Hessian matrix at the point of convergence. This additional check keeps the solver from falsely accepting non-optimal stationary points, such as saddle points, as valid minima.