We consider the problem of minimizing an objective function without any derivative information. Such optimization is called zeroth-order, derivative-free, or black-box optimization. When the problem is high-dimensional, existing state-of-the-art zeroth-order methods often suffer from the curse of dimensionality. In this talk, we explore a novel compressible-gradients assumption and propose two new methods, namely ZORO and SCOBO, for high-dimensional zeroth-order optimization. In particular, ZORO uses evaluations of the objective function, while SCOBO uses only comparison information between points. Furthermore, we propose a block coordinate descent algorithm, coined ZO-BCD, for ultra-high-dimensional settings. We show that the query complexities of ZORO, SCOBO, and ZO-BCD depend only logarithmically on the problem dimension. Numerical experiments show that the proposed methods outperform state-of-the-art methods on both synthetic and real datasets.
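The following is a minimal sketch of the compressible-gradient idea behind this line of work, not the speaker's ZORO, SCOBO, or ZO-BCD algorithms: when the gradient is (approximately) sparse, it can be estimated from far fewer function queries than the problem dimension by combining randomized finite differences with a generic sparse-recovery routine (orthogonal matching pursuit here). The function names (zo_sparse_gradient, zo_descent), the recovery step, and all parameter values are illustrative assumptions, not details from the talk.

import numpy as np

def zo_sparse_gradient(f, x, m, s, delta=1e-4):
    """Estimate an s-sparse approximation of grad f(x) from m+1 function queries."""
    d = x.size
    Z = np.random.choice([-1.0, 1.0], size=(m, d))               # Rademacher query directions
    fx = f(x)
    y = np.array([(f(x + delta * z) - fx) / delta for z in Z])   # y_i ~ <z_i, grad f(x)>
    # Recover a sparse gradient from the underdetermined system y ~ Z g
    # via orthogonal matching pursuit (a generic sparse-recovery routine).
    g, support, residual = np.zeros(d), [], y.copy()
    for _ in range(s):
        j = int(np.argmax(np.abs(Z.T @ residual)))               # coordinate most correlated with residual
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(Z[:, support], y, rcond=None) # refit on the current support
        residual = y - Z[:, support] @ coef
    g[support] = coef
    return g

def zo_descent(f, x0, steps=100, lr=0.5, m=100, s=5):
    """Plain gradient descent driven by the zeroth-order sparse gradient estimate."""
    x = x0.copy()
    for _ in range(steps):
        x = x - lr * zo_sparse_gradient(f, x, m, s)
    return x

# Toy check: a 1000-dimensional quadratic whose gradient is exactly 5-sparse.
if __name__ == "__main__":
    d = 1000
    f = lambda v: 0.5 * float(np.sum(v[:5] ** 2))
    x_star = zo_descent(f, np.random.randn(d))
    print(f"objective after optimization: {f(x_star):.2e}")      # near zero, ~101 queries per step

Each step here uses roughly 100 queries instead of the d+1 = 1001 needed for coordinate-wise finite differences, which is the kind of dimension savings the compressible-gradient assumption is meant to enable.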

22 Apr 2021
10:30am - 12:00pm
Where
https://hkust.zoom.us/j/99988827320 (Passcode: hkust)
Speakers/Performers
Prof. HanQin CAI
UCLA
Organizer(s)
Department of Mathematics
Audience
Alumni, Faculty and staff, PG students, UG students
Language(s)
English