Randomized subspace methods for high-dimensional model-based derivative-free optimization (12 mins)

Abstract

Derivative-free optimization (DFO) is the mathematical study of optimization algorithms that do not use derivatives. Model-based DFO methods are widely used in practice but are known to struggle in high dimensions. This talk provides a brief overview of recent research that addresses this issue by searching for decrease within randomly sampled low-dimensional subspaces. In particular, we examine the requirements for model accuracy and subspace quality in these methods, and compare their convergence guarantees and complexity bounds. The talk concludes with a discussion of some promising future directions in this area.
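To make the core idea concrete, the following is a minimal sketch (not the specific algorithm from the talk) of a randomized subspace model-based step: at each iteration, draw a random subspace basis, build a simple linear model of the objective restricted to that subspace using finite differences, and accept the trial step only if it decreases the function. The function names, step-control rule, and parameter choices here are illustrative assumptions.

```python
import numpy as np

def random_subspace_dfo(f, x0, p=5, iters=500, h=1e-6, seed=0):
    """Illustrative randomized subspace DFO sketch:
    - sample a random n-by-p subspace basis P each iteration,
    - approximate the gradient of s -> f(x + P s) at s = 0 by
      forward finite differences (a crude linear model),
    - take a damped step along the model's negative gradient,
      accepted only on simple decrease."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    n = x.size
    fx = f(x)
    step = 1.0
    for _ in range(iters):
        # Random subspace basis with E[P @ P.T] = I (Gaussian, scaled)
        P = rng.standard_normal((n, p)) / np.sqrt(p)
        # Finite-difference model gradient in the subspace coordinates
        g = np.array([(f(x + h * P[:, j]) - fx) / h for j in range(p)])
        trial = x - step * (P @ g)
        f_trial = f(trial)
        if f_trial < fx:              # simple decrease test
            x, fx = trial, f_trial
            step = min(2.0 * step, 1.0)
        else:
            step *= 0.5               # shrink the step on failure
    return x, fx

# Usage: a 100-dimensional convex quadratic with minimizer at the all-ones vector
f = lambda x: np.sum((x - 1.0) ** 2)
x_opt, f_opt = random_subspace_dfo(f, np.zeros(100), p=5, iters=500)
```

Each iteration costs only p + 1 function evaluations regardless of the ambient dimension n, which is the practical motivation for working in a low-dimensional subspace; the theoretical questions discussed in the talk concern how accurate the subspace model must be, and how "good" the random subspace must be, for convergence guarantees to hold.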

Date
Oct 18, 2025
Location
WCOM 2025
Kelowna, BC
Yiwen Chen
PhD student in Mathematics

My research interests center on the theoretical foundations of derivative-free optimization, with a particular emphasis on model accuracy, complexity analysis, and randomized subspace methods for high-dimensional problems. I am also interested in discrete geometry and polytope theory.