We consider the problem of minimizing an objective function without any derivative information. Such optimization is called zeroth-order, derivative-free, or black-box optimization. When the problem dimension is large, existing state-of-the-art zeroth-order methods often suffer from the curse of dimensionality. In this talk, we explore a novel compressible-gradients assumption and propose two new methods, namely ZORO and SCOBO, for high-dimensional zeroth-order optimization. In particular, ZORO uses evaluations of the objective function, while SCOBO uses only comparison information between points. Furthermore, we propose a block coordinate descent algorithm, coined ZO-BCD, for ultra-high-dimensional settings. We show that the query complexities of ZORO, SCOBO, and ZO-BCD depend only logarithmically on the problem dimension. Numerical experiments show that the proposed methods outperform state-of-the-art methods on both synthetic and real datasets.
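To give a flavor of the compressible-gradients idea, the sketch below estimates a gradient from function queries alone: probe the objective along random sign directions, fit the directional derivatives, and keep only the few largest entries. This is an illustrative toy, not the authors' algorithm; ZORO uses far fewer queries than the dimension together with a proper compressed-sensing recovery solver, whereas this sketch uses a plain least-squares fit followed by hard thresholding. The function names and parameters (`zo_sparse_grad`, `s`, `m`, `delta`) are hypothetical.

```python
import numpy as np

def zo_sparse_grad(f, x, s, m, delta=1e-4, rng=None):
    """Toy sparsity-aware zeroth-order gradient estimator.

    Queries f along m random sign directions, solves a least-squares
    system for the gradient, then hard-thresholds to the s entries of
    largest magnitude. (ZORO instead takes m far smaller than the
    dimension and runs a sparse-recovery solver.)
    """
    rng = np.random.default_rng(rng)
    d = x.size
    Z = rng.choice([-1.0, 1.0], size=(m, d))        # random sign directions
    # finite-difference approximations of the directional derivatives z . grad f(x)
    y = np.array([(f(x + delta * z) - f(x)) / delta for z in Z])
    g, *_ = np.linalg.lstsq(Z, y, rcond=None)       # dense gradient estimate
    keep = np.argsort(np.abs(g))[-s:]               # indices of s largest entries
    g_sparse = np.zeros(d)
    g_sparse[keep] = g[keep]
    return g_sparse

# toy usage: a 50-dimensional objective that depends on only 2 coordinates,
# so its gradient is exactly 2-sparse
f = lambda x: x[0] ** 2 + 3.0 * x[7] ** 2
x0 = np.ones(50)
g = zo_sparse_grad(f, x0, s=2, m=64, rng=0)
# the two nonzero entries land on coordinates 0 and 7,
# near the true partial derivatives 2 and 6
```

The key point the sketch illustrates is that when the gradient is (approximately) sparse, it can be recovered from a number of function queries governed by the sparsity level rather than the ambient dimension, which is what drives the logarithmic dimension dependence mentioned in the abstract.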

April 22
10:30am - 12:00pm
Venue
https://hkust.zoom.us/j/99988827320 (Passcode: hkust)
Speaker/Performer
Prof. HanQin CAI
UCLA
Organizer
Department of Mathematics
Audience
Alumni, Faculty and staff, PG students, UG students
Language
English