Effective and Efficient Methods for Constrained Stochastic and Derivative-free Optimization
Shi, Jiahao
2025
Abstract
In this thesis, we develop, analyze, and implement effective and efficient methods for constrained stochastic and derivative-free optimization. For each method, we prove convergence results under general assumptions and conduct numerical experiments to demonstrate its efficiency and efficacy.

In Chapter 2, we develop a model-based derivative-free optimization method. A key component of variational quantum algorithms (VQAs) is the choice of a classical optimizer to update the ansatz parameterization. Quantum algorithms, however, must run on noisy hardware with limited fidelity, so objective function evaluations, such as those in the quantum approximate optimization algorithm (QAOA) or the variational quantum eigensolver (VQE), are compromised by both stochastic errors and hardware-induced noise. To address these challenges, we adapt recent developments from the noise-aware numerical optimization literature to commonly used derivative-free model-based methods. We introduce the key characteristics that distinguish these noise-aware derivative-free model-based methods from standard model-based methods, study an implementation, and compare its performance on demonstrative VQA simulations to the classical solvers packaged in scikit-quant.

In the remaining chapters, we propose a series of sequential quadratic programming (SQP) methods tailored for stochastic equality-constrained optimization. In Chapter 3, we develop a stochastic method for equality-constrained problems that uses predictive variance reduction within the SQP paradigm; we prove that a measure of first-order stationarity converges to zero in expectation under reasonable assumptions and demonstrate practical performance on constrained binary classification problems. In Chapter 4, we propose a line search SQP method for general nonlinear equality-constrained optimization that employs a modified line search strategy using second-order information to mitigate the Maratos effect, thereby achieving global convergence and local superlinear convergence. We further extend this approach to settings where the objective function is stochastic or given as a finite sum, design a practical inexact matrix-free variant, and demonstrate its numerical performance empirically. Finally, in Chapter 5, we propose, analyze, and implement an SQP algorithm for minimizing a nonlinear smooth function subject to smooth equality constraints with noisy function and derivative estimates, addressing the challenges posed by bounded noise in the objective and constraint functions as well as potential rank deficiency in the constraint Jacobians. The algorithm employs a decomposition approach to compute inexact search directions and uses adaptive step size selection schemes that guarantee global convergence of the infeasibility stationarity error to a neighborhood of zero proportional to the noise level.
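Since Chapters 3 through 5 all build on the SQP paradigm for equality-constrained problems, a compact illustration may help fix ideas. The sketch below is not any of the thesis algorithms; it is a generic Newton-KKT step with a backtracking line search on an l1 merit function, using an identity Hessian approximation and a hypothetical toy problem, all of which are assumptions made only for illustration.

```python
# Minimal sketch of a basic line-search SQP iteration for
#   min f(x)  s.t.  c(x) = 0.
# The thesis methods add variance reduction, second-order corrections, and
# noise-aware step-size rules on top of this skeleton; none of that is shown here.
import numpy as np


def sqp_equality(f, c, x0, grad_f, jac_c, max_iter=50, tol=1e-8, mu=10.0):
    """Generic equality-constrained SQP with an l1 merit line search (illustrative)."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    for _ in range(max_iter):
        g, J, cx = grad_f(x), jac_c(x), np.atleast_1d(c(x))
        m = cx.size
        # Least-squares multiplier estimate; stop when the KKT residual is small.
        lam = np.linalg.lstsq(J.T, -g, rcond=None)[0]
        if max(np.linalg.norm(g + J.T @ lam, np.inf), np.linalg.norm(cx, np.inf)) <= tol:
            break
        # Newton-KKT system with the identity as the Hessian approximation.
        K = np.block([[np.eye(n), J.T], [J, np.zeros((m, m))]])
        d = np.linalg.solve(K, -np.concatenate([g, cx]))[:n]
        # Backtracking on the l1 merit function phi(x) = f(x) + mu * ||c(x)||_1.
        phi = lambda z: f(z) + mu * np.linalg.norm(np.atleast_1d(c(z)), 1)
        D = np.dot(g, d) - mu * np.linalg.norm(cx, 1)  # model of directional decrease
        alpha, phi_x = 1.0, phi(x)
        while phi(x + alpha * d) > phi_x + 1e-4 * alpha * D and alpha > 1e-10:
            alpha *= 0.5
        x = x + alpha * d
    return x


if __name__ == "__main__":
    # Toy problem: minimize x0^2 + x1^2 subject to x0 + x1 = 1 (solution [0.5, 0.5]).
    f = lambda x: x[0] ** 2 + x[1] ** 2
    c = lambda x: np.array([x[0] + x[1] - 1.0])
    grad_f = lambda x: 2.0 * x
    jac_c = lambda x: np.array([[1.0, 1.0]])
    print(sqp_equality(f, c, np.array([3.0, -2.0]), grad_f, jac_c))
```

In this generic form, the objective and constraint derivatives are assumed to be exact; the stochastic and noisy settings studied in the thesis replace them with sampled or inexact estimates and adapt the step-size selection accordingly.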
Subjects
Nonlinear Optimization; Stochastic Optimization; Constrained Optimization; Derivative-Free Optimization; Machine Learning
Types
Thesis
Showing items related by title, author, creator and subject.
- Bartoli, Nathalie; Lefebvre, Thierry; Dubreuil, Sylvain; Olivanti, Romain; Bons, Nicolas; Martins, Joaquim; Bouhlel, Mohamed-Amine; Morlier, Joseph (American Institute of Aeronautics and Astronautics, 2017)
- Taylor, John E.; Rossow, Mark P. (Elsevier, 1977)
- Kolmanovsky, Ilya V.; Sun, Jing; Sivashankar, Shiva N. (Blackwell Publishing Ltd, 2006-09)