Randomized subspace methods for high-dimensional model-based derivative-free optimization

Abstract

Derivative-free optimization (DFO) is the mathematical study of optimization algorithms that do not use derivatives. Model-based DFO methods are widely used in practice but are known to struggle in high dimensions. This talk provides a brief overview of recent research that addresses this issue by seeking decrease within randomly sampled low-dimensional subspaces. In particular, we examine the requirements for model accuracy and subspace quality in these methods, and compare their convergence guarantees and complexity bounds. We conclude with a discussion of some promising future directions in this area.
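To illustrate the core idea, here is a minimal sketch of a randomized-subspace derivative-free step: at each iteration a random low-dimensional subspace is drawn and a few points in it are polled, accepting any decrease. This is a simplified direct-search-style illustration under assumed parameters (subspace dimension `p`, step-halving on failure), not the specific model-based methods discussed in the talk.

```python
import numpy as np

def random_subspace_search(f, x, n_iters=200, p=2, step=0.5, seed=0):
    """Toy randomized-subspace DFO sketch (illustrative, not the talk's methods):
    each iteration draws a random p-dimensional subspace of R^n, polls the
    +/- basis directions within it, and accepts the first decrease found."""
    rng = np.random.default_rng(seed)
    n = x.size
    fx = f(x)
    for _ in range(n_iters):
        # Orthonormal basis of a random p-dimensional subspace (Gaussian sketch + QR)
        Q, _ = np.linalg.qr(rng.standard_normal((n, p)))
        improved = False
        for d in np.vstack([Q.T, -Q.T]):  # poll along +/- each basis direction
            trial = x + step * d
            ft = f(trial)
            if ft < fx:  # simple decrease condition
                x, fx, improved = trial, ft, True
                break
        if not improved:
            step *= 0.5  # shrink the step when no decrease is found
    return x, fx

# Example: minimize a convex quadratic in 50 dimensions using 2-dimensional subspaces
f = lambda x: float(x @ x)
x_final, f_final = random_subspace_search(f, np.ones(50))
```

Each iteration evaluates the objective at most 2p points, independent of the ambient dimension n, which is the appeal of subspace methods in high dimensions; the methods in the talk replace the polling step with low-dimensional interpolation models.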

Date
Oct 18, 2025
Location
WCOM 2025
Kelowna, BC
Yiwen Chen
PhD student in Mathematics

My research interests include derivative-free optimization, numerical optimization, and discrete geometry.