Randomized subspace methods for high-dimensional model-based derivative-free optimization (35 minutes)

Abstract

Derivative-free optimization (DFO) is the mathematical study of optimization algorithms that do not use derivatives. Model-based DFO methods are widely used in practice but are known to struggle in high dimensions. This talk gives a brief overview of recent research that addresses this issue by searching for decrease within randomly sampled low-dimensional subspaces, covering both unconstrained and convex-constrained problems. In particular, we examine the requirements on model accuracy and subspace quality in these methods, and compare their convergence guarantees and complexity bounds. The talk concludes with a discussion of some promising future directions in this area.
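To give a rough sense of how such methods operate, below is a minimal Python sketch of a random-subspace model-based trust-region loop. The function name, hyperparameters, and the simple linear (Cauchy-step) subspace model are illustrative assumptions for this sketch only, not the specific algorithms analysed in the talk.

import numpy as np

def random_subspace_dfo(f, x0, p=2, delta0=1.0, max_iters=200,
                        eta=0.1, gamma_inc=2.0, gamma_dec=0.5, delta_min=1e-8):
    # Illustrative random-subspace model-based trust-region loop.
    # All parameter names and defaults are assumptions for this sketch.
    x = np.asarray(x0, dtype=float)
    n = x.size
    fx = f(x)
    delta = delta0

    for _ in range(max_iters):
        if delta < delta_min:
            break

        # Draw a random p-dimensional subspace: orthonormalise the columns
        # of an n-by-p Gaussian matrix.
        Q, _ = np.linalg.qr(np.random.randn(n, p))

        # Linear model of f restricted to the subspace, built from
        # p interpolation points x + delta * Q[:, j].
        g = np.array([(f(x + delta * Q[:, j]) - fx) / delta for j in range(p)])
        gnorm = np.linalg.norm(g)
        if gnorm == 0.0:
            delta *= gamma_dec
            continue

        # Cauchy-type step: minimise the linear model over the trust region
        # of radius delta in the subspace coordinates.
        s = -(delta / gnorm) * g
        x_trial = x + Q @ s
        f_trial = f(x_trial)

        # Accept or reject based on the ratio of actual to predicted decrease.
        predicted = -g @ s  # equals delta * gnorm > 0
        rho = (fx - f_trial) / predicted
        if rho >= eta:
            x, fx = x_trial, f_trial
            delta *= gamma_inc
        else:
            delta *= gamma_dec

    return x, fx

# Example usage on a 1000-dimensional quadratic (illustrative only):
# x_opt, f_opt = random_subspace_dfo(lambda x: np.sum(x**2), np.ones(1000), p=3)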

Date
Dec 5, 2025
Event
University of Melbourne
Location
University of Melbourne
Melbourne, VIC
Yiwen Chen
PhD student in Mathematics

My research interests include derivative-free optimization, numerical optimization, and discrete geometry.