Authors: Audet, Charles; Dennis, J.E. Jr.
Date of Issue: 2004-01
Date Deposited: 2018-06-18
Citation: Audet, Charles and Dennis, J.E. Jr. "Mesh Adaptive Direct Search Algorithms for Constrained Optimization." (2004). https://hdl.handle.net/1911/102015.
URI: https://hdl.handle.net/1911/102015
Abstract: This paper introduces the Mesh Adaptive Direct Search (MADS) class of algorithms for nonlinear optimization. MADS extends the Generalized Pattern Search (GPS) class by allowing local exploration, called polling, in a dense set of directions in the space of optimization variables. This means that under certain hypotheses, including a weak constraint qualification due to Rockafellar, MADS can treat constraints by the extreme barrier approach of setting the objective to infinity at infeasible points and treating the problem as unconstrained. The main GPS convergence result identifies limit points where the Clarke generalized derivatives are nonnegative in a finite set of directions, called refining directions. Although in the unconstrained case nonnegative combinations of these directions span the whole space, the fact that there can be only finitely many GPS refining directions limits rigorous justification of the barrier approach to finitely many constraints for GPS. The MADS class of algorithms extends this result; the set of refining directions may even be dense in R^n, although we give an example where it is not. We present an implementable instance of MADS, and we illustrate and compare it with GPS on some test problems. We also illustrate the limitations of our results with examples.
Extent: 27 pp
Language: English
Title: Mesh Adaptive Direct Search Algorithms for Constrained Optimization
Type: Technical report
Report Number: TR04-02
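
As an illustration of the extreme barrier approach mentioned in the abstract, below is a minimal Python sketch of a direct-search loop that wraps the objective so infeasible points evaluate to infinity. The objective, constraint, step-update rule, and poll scheme are illustrative assumptions, not the report's method: the poll uses fixed coordinate directions (GPS-style) rather than the asymptotically dense direction sets that define MADS.

import math

def extreme_barrier(f, is_feasible):
    # Extreme barrier: infeasible points get f = +infinity, so a
    # direct-search method can treat the problem as unconstrained.
    def f_ext(x):
        return f(x) if is_feasible(x) else math.inf
    return f_ext

def coordinate_poll(f_ext, x, delta):
    # Simplified GPS-style poll along +/- coordinate directions.
    # (MADS would instead draw poll directions from an asymptotically
    # dense set; that machinery is omitted in this sketch.)
    for i in range(len(x)):
        for sign in (+1.0, -1.0):
            trial = list(x)
            trial[i] += sign * delta
            if f_ext(trial) < f_ext(x):
                return trial
    return None

def direct_search(f, is_feasible, x0, delta=1.0, tol=1e-6, max_iter=1000):
    f_ext = extreme_barrier(f, is_feasible)
    x = list(x0)
    for _ in range(max_iter):
        if delta < tol:
            break
        better = coordinate_poll(f_ext, x, delta)
        if better is not None:
            x = better          # successful poll: keep the step size
        else:
            delta *= 0.5        # unsuccessful poll: refine the mesh
    return x

# Hypothetical example: minimize x^2 + y^2 subject to x + y >= 1,
# starting from the feasible point (2, 2).
if __name__ == "__main__":
    obj = lambda x: x[0] ** 2 + x[1] ** 2
    feas = lambda x: x[0] + x[1] >= 1.0
    print(direct_search(obj, feas, x0=[2.0, 2.0]))

On this example the fixed coordinate poll stalls at a boundary point such as (0, 1) rather than the constrained optimum (0.5, 0.5): with only finitely many poll directions, no feasible descent direction is ever tried. This is the kind of limitation, tied to GPS having finitely many refining directions, that the dense poll directions of MADS are designed to overcome.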