The remarkable empirical performance of Generative Adversarial Networks (GANs) in generating high-quality samples has attracted enormous attention in the past few years. In this talk, we discuss how well GANs can approximate and learn high-dimensional distributions. We show that deep ReLU neural networks can transform a low-dimensional source distribution into a distribution that is arbitrarily close to a high-dimensional target distribution in Wasserstein distance, with an approximation order that depends only on the intrinsic dimension of the target distribution. When only finitely many samples are observed, we prove that GANs are consistent estimators of the data distribution under the Wasserstein distance, provided the generator and discriminator network architectures are properly chosen. Furthermore, the convergence rates do not depend on the high ambient dimension but on the lower intrinsic dimension of the target distribution, which implies that GANs can overcome the curse of dimensionality.
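
As a hedged illustration of the setting described above (not code from the talk), the sketch below is a minimal WGAN-style setup in PyTorch in which a deep ReLU generator pushes a low-dimensional Gaussian source forward into a higher-dimensional ambient space, while a critic network estimates the Wasserstein-1 distance through its dual formulation. All names, dimensions, the toy target distribution, and hyperparameters are illustrative assumptions.

# Minimal, hypothetical sketch: ReLU generator pushing a low-dimensional source
# distribution into a high-dimensional ambient space, trained against a critic
# in the spirit of the Wasserstein GAN. Illustrative only.
import torch
import torch.nn as nn

SOURCE_DIM = 2      # low-dimensional source (latent) distribution
AMBIENT_DIM = 64    # high-dimensional ambient space of the target data
HIDDEN = 128

# Generator: a deep ReLU network transforming the source distribution.
generator = nn.Sequential(
    nn.Linear(SOURCE_DIM, HIDDEN), nn.ReLU(),
    nn.Linear(HIDDEN, HIDDEN), nn.ReLU(),
    nn.Linear(HIDDEN, AMBIENT_DIM),
)

# Critic (discriminator): plays the role of the test functions in the
# dual formulation of the Wasserstein-1 distance.
critic = nn.Sequential(
    nn.Linear(AMBIENT_DIM, HIDDEN), nn.ReLU(),
    nn.Linear(HIDDEN, HIDDEN), nn.ReLU(),
    nn.Linear(HIDDEN, 1),
)

g_opt = torch.optim.RMSprop(generator.parameters(), lr=5e-5)
c_opt = torch.optim.RMSprop(critic.parameters(), lr=5e-5)

def sample_target(n):
    # Toy target: data concentrated near a low-dimensional set embedded in R^AMBIENT_DIM.
    z = torch.randn(n, SOURCE_DIM)
    x = torch.zeros(n, AMBIENT_DIM)
    x[:, :SOURCE_DIM] = torch.sin(z)  # simple nonlinear embedding, for illustration
    return x

for step in range(1000):
    # Critic updates: maximize E[f(real)] - E[f(fake)], with weight clipping to
    # (crudely) enforce the Lipschitz constraint, as in the original WGAN.
    for _ in range(5):
        real = sample_target(64)
        fake = generator(torch.randn(64, SOURCE_DIM)).detach()
        c_loss = -(critic(real).mean() - critic(fake).mean())
        c_opt.zero_grad(); c_loss.backward(); c_opt.step()
        for p in critic.parameters():
            p.data.clamp_(-0.01, 0.01)

    # Generator update: minimize -E[f(fake)], shrinking the estimated Wasserstein gap.
    fake = generator(torch.randn(64, SOURCE_DIM))
    g_loss = -critic(fake).mean()
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()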

21 Apr 2021
10:00am - 11:00am
Where
https://hkust.zoom.us/j/5906683526 (Passcode: 5956)
Speakers/Performers
Mr. Yunfei YANG
Organizer(s)
Department of Mathematics
Audience
Alumni, Faculty and staff, PG students, UG students
Language(s)
English