The remarkable empirical performance of Generative Adversarial Networks (GANs) in generating high-quality samples has attracted enormous attention in the past few years. In this talk, we discuss how well GANs can approximate and learn high-dimensional distributions. We show that deep ReLU neural networks can transform a low-dimensional source distribution into a distribution that is arbitrarily close to a high-dimensional target distribution in Wasserstein distance. The approximation order depends only on the intrinsic dimension of the target distribution. When only finitely many samples are observed, we prove that GANs are consistent estimators of the data distribution under Wasserstein distance, provided the generator and discriminator network architectures are properly chosen. Furthermore, the convergence rates depend not on the high ambient dimension but on the lower intrinsic dimension of the target distribution, which implies that GANs can overcome the curse of dimensionality.
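The two ingredients of the abstract can be sketched numerically: a ReLU network pushing a low-dimensional source distribution forward, and the Wasserstein distance between the push-forward and a target sample. The sketch below is illustrative only — the weights are hand-picked rather than learned adversarially, the function names are our own, and we use the fact that in one dimension the empirical Wasserstein-1 distance reduces to comparing sorted samples.

```python
import numpy as np

def wasserstein1(x, y):
    """Empirical Wasserstein-1 distance between two equal-size 1-D samples.
    In 1-D the optimal coupling matches order statistics, so W1 is the
    mean absolute difference of the sorted samples."""
    x, y = np.sort(np.asarray(x, float)), np.sort(np.asarray(y, float))
    assert x.shape == y.shape
    return float(np.mean(np.abs(x - y)))

def relu_generator(z, W1, b1, W2, b2):
    """One-hidden-layer ReLU network pushing source samples z forward."""
    h = np.maximum(0.0, z @ W1 + b1)
    return h @ W2 + b2

rng = np.random.default_rng(0)
z = rng.uniform(-1.0, 1.0, size=(5000, 1))   # low-dimensional source

# Hand-picked (not trained) weights: relu(z) - relu(-z) = z, so this
# generator realizes the identity map on [-1, 1].
W1 = np.array([[1.0, -1.0]]); b1 = np.zeros(2)
W2 = np.array([[1.0], [-1.0]]); b2 = np.zeros(1)

gx = relu_generator(z, W1, b1, W2, b2).ravel()
target = rng.uniform(-1.0, 1.0, size=5000)   # target distribution sample
print(wasserstein1(gx, target))              # small: push-forward ≈ target
```

With learned weights, the same push-forward construction lets the generator approximate far richer targets; the talk's results quantify how the achievable Wasserstein error scales with the target's intrinsic dimension.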

When
21 Apr 2021
10:00am - 11:00am
Where
https://hkust.zoom.us/j/5906683526 (Passcode: 5956)
Speakers/Performers
Mr. Yunfei YANG
Organizer(s)
Department of Mathematics
Audience
Alumni, Faculty and staff, PG students, UG students
Language(s)
English