We propose a low-rank Gaussian mixture model (LrMM) in which each matrix-valued observation has a planted low-rank structure. Minimax lower bounds for estimating the underlying low-rank matrix are established for a whole range of sample sizes and signal strengths. Under a minimal condition on the signal strength, referred to as the information-theoretic limit or statistical limit, we prove the minimax optimality of a maximum likelihood estimator which, in general, is computationally infeasible. When the signal is stronger than a certain threshold, called the computational limit, we design a computationally fast estimator based on spectral aggregation and demonstrate its minimax optimality. Moreover, when the signal strength falls below the computational limit, we provide evidence based on the low-degree likelihood ratio framework that no polynomial-time algorithm can consistently recover the underlying low-rank matrix. Our results reveal multiple phase transitions in the minimax error rates and the statistical-to-computational gap.
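
For concreteness, one common way to write such a planted low-rank mixture is the following minimal sketch (the notation and exact formulation here are assumptions and may differ from the talk):
\[
X_i \;=\; M_{z_i} + E_i, \qquad i = 1, \dots, n,
\]
where $z_i \in \{1, \dots, K\}$ is a latent cluster label, each mean matrix $M_k \in \mathbb{R}^{p_1 \times p_2}$ has rank at most $r$, and the entries of the noise matrices $E_i$ are i.i.d. Gaussian. In this setting the signal strength is governed by the separation between the mean matrices relative to the noise level, and the goal is to recover the low-rank means $M_k$ (and, implicitly, the labels $z_i$) from the sample $X_1, \dots, X_n$.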

25 Apr 2022
9:30am - 10:30am
Where
https://hkust.zoom.us/j/97582756639 (Passcode: hkust)
Speakers/Performers
Mr. Zhongyuan LYU
Organizer(s)
Department of Mathematics
Audience
Alumni, Faculty and staff, PG students, UG students
Language(s)
English