If you use PDFO, please cite the following paper. Note that PDFO
contains improvements and bug fixes that do not exist in Powell's
original code. See Subsections 4.3–4.5 of the paper below for details.
[1]
T. M. Ragonneau and Z. Zhang, PDFO: a cross-platform package for
Powell's derivative-free optimization solvers, arXiv:2302.13246, 2023.
@misc{Ragonneau_Zhang_2023,
    title        = {{PDFO}: a cross-platform package for {P}owell's derivative-free optimization solvers},
    author       = {Ragonneau, T. M. and Zhang, Z.},
    howpublished = {arXiv:2302.13246},
    year         = 2023,
}
In addition, references for Powell's methods are as follows.
[2]
M. J. D. Powell, A direct search optimization method that models the objective and constraint functions by linear interpolation, In Advances in Optimization and Numerical Analysis, eds. S. Gomez and J. P. Hennart, pages 51–67, Springer Verlag, Dordrecht, Netherlands, 1994.
[3]
M. J. D. Powell, UOBYQA: unconstrained optimization by quadratic approximation, Math. Program., 92(B):555–582, 2002.
[4]
M. J. D. Powell, Least Frobenius norm updating of quadratic models that satisfy interpolation conditions, Math. Program., 100:183–215, 2004.
[5]
M. J. D. Powell, On the use of quadratic models in unconstrained minimization without derivatives, Optim. Methods Softw., 19:399–411, 2004.
[6]
M. J. D. Powell, On updating the inverse of a KKT matrix, In Numerical Linear Algebra and Optimization, ed. Ya-xiang Yuan, pages 56–78, Science Press (Beijing), 2004.
[7]
M. J. D. Powell, The NEWUOA software for unconstrained optimization without derivatives, In Large-Scale Nonlinear Optimization, eds. G. Di Pillo and M. Roma, pages 255–297, Springer, New York, US, 2006.
[8]
M. J. D. Powell, A view of algorithms for optimization without derivatives, Technical Report DAMTP 2007/NA63, Department of Applied Mathematics and Theoretical Physics, Cambridge University, Cambridge, UK, 2007.
[9]
M. J. D. Powell, Developments of NEWUOA for minimization without derivatives, IMA J. Numer. Anal., 28:649–664, 2008.
[10]
M. J. D. Powell, The BOBYQA algorithm for bound constrained optimization without derivatives, Technical Report DAMTP 2009/NA06, Department of Applied Mathematics and Theoretical Physics, Cambridge University, Cambridge, UK, 2009.
[11]
M. J. D. Powell, On fast trust region methods for quadratic models with linear constraints, Math. Program. Comput., 7:237–267, 2015.
[12]
T. M. Ragonneau, Model-Based Derivative-Free Optimization Methods and Software, Chapter 3, PhD thesis, The Hong Kong Polytechnic University, Hong Kong SAR, China, 2022.
[13]
Z. Zhang, PRIMA: Reference Implementation for Powell's Methods with Modernization and Amelioration, available at http://www.libprima.net, 2023.
Remarks
- A key technique underlying the success of NEWUOA, BOBYQA, and LINCOA is the least Frobenius norm updating of quadratic models elaborated in [4] and [5]; a sketch of this variational problem is given after these remarks. The idea comes from the least change update for quasi-Newton methods, a vast research area initiated by the DFP algorithm, in which the P stands for Powell.
- The least Frobenius norm updating is a quadratic programming problem whose constraints correspond to the interpolation conditions. At each iteration of Powell's algorithms, only one of these constraints differs from the previous iteration. To solve the problem efficiently and stably, Powell designed a procedure that updates the inverse of its KKT matrix along the iterations (see the illustration after these remarks). This procedure is detailed in [6], and it is indispensable for the remarkable numerical stability of NEWUOA, BOBYQA, and LINCOA.
- LINCOA seeks the least value of a nonlinear function subject to linear inequality constraints, without using derivatives of the objective function. Professor Powell did not publish a paper introducing the algorithm; the paper [11] discusses how LINCOA solves its trust-region subproblems. A sketch of a call to LINCOA through PDFO is given after these remarks.
- Unlike PDFO, which provides interfaces to Powell's original code, [13] provides the modernized reference implementation of Powell's methods.
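For the first remark, the variational problem behind the least Frobenius norm updating can be sketched as follows. Given interpolation points y_1, ..., y_m and the current quadratic model Q_k, the new model changes the Hessian as little as possible in the Frobenius norm while interpolating the objective f. In LaTeX notation (schematic only; see [4] and [5] for the precise formulation and its properties):

    \begin{align*}
      \min_{Q \ \text{quadratic}} \quad & \|\nabla^2 Q - \nabla^2 Q_k\|_F \\
      \text{subject to} \quad & Q(y_j) = f(y_j), \quad j = 1, \dots, m.
    \end{align*}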
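To illustrate the inverse-updating idea in the second remark, the Python sketch below applies the generic Sherman-Morrison formula to refresh a stored inverse after a rank-one change of the matrix. This is only an analogy: replacing one interpolation point changes one row and one column of the symmetric KKT matrix (a rank-two modification), and Powell's actual procedure in [6] is specialized to that structure and organized for numerical stability. The function name here is ours, not part of any package.

    import numpy as np

    def sherman_morrison_update(H, u, v):
        # Given H = inv(W), return inv(W + u v^T) without refactorizing W.
        # (Illustrative only; Powell's formulas in [6] exploit the KKT structure.)
        Hu = H @ u
        vH = v @ H
        denom = 1.0 + v @ Hu
        return H - np.outer(Hu, vH) / denom

    # Toy check: perturb a matrix by a rank-one term and compare with a fresh inverse.
    rng = np.random.default_rng(0)
    W = rng.standard_normal((5, 5)) + 5.0 * np.eye(5)
    H = np.linalg.inv(W)
    u, v = rng.standard_normal(5), rng.standard_normal(5)
    H_new = sherman_morrison_update(H, u, v)
    print(np.allclose(H_new, np.linalg.inv(W + np.outer(u, v))))  # True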
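For the third remark, the following is a minimal sketch of calling LINCOA through PDFO's Python interface on a linearly constrained problem. It assumes the interface mirrors scipy.optimize.minimize; the objective and constraint are made up for illustration, and the PDFO documentation is the authoritative reference for the API.

    import numpy as np
    from scipy.optimize import LinearConstraint
    from pdfo import pdfo

    def objective(x):
        # A smooth objective; LINCOA never evaluates its derivatives.
        return (x[0] - 1.0) ** 2 + (x[1] - 2.0) ** 2

    # Linear inequality constraint: x[0] + x[1] <= 2.
    lin_con = LinearConstraint(np.array([[1.0, 1.0]]), -np.inf, 2.0)

    res = pdfo(objective, x0=np.array([0.0, 0.0]), method='lincoa',
               constraints=lin_con)
    print(res.x, res.fun)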