Normalizing flows (NFs) have gained popularity over traditional maximum-likelihood-based methods due to their strong capability to model complex data distributions. However, the standard approach, which maps the observed data to a normal distribution, has difficulty handling data distributions with multiple relatively isolated modes. To overcome this issue, we propose a new framework based on variational latent representation to improve the practical performance of NF. The idea is to replace the standard normal latent variable with a more general latent representation, jointly learned via variational Bayes. For example, by taking the latent representation to be a discrete sequence, our framework can learn a Transformer model that generates the latent sequence and an NF model that generates the continuous data distribution conditioned on that sequence. The resulting method is significantly more powerful than the standard normalizing flow approach for generating data distributions with multiple modes. Extensive experiments demonstrate the advantages of NF with variational latent representation.
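The core idea — conditioning an invertible map on a discrete latent code so that each code can cover one isolated mode — can be sketched in a minimal toy form. This is an illustrative sketch only, not the talk's method: the categorical prior stands in for the learned Transformer prior, and the per-code affine map stands in for a full conditional NF.

```python
import numpy as np

rng = np.random.default_rng(0)

# Categorical prior over discrete latent codes (a stand-in for the
# learned autoregressive/Transformer prior described in the abstract).
codes = np.array([0, 1])
prior = np.array([0.5, 0.5])

# Per-code parameters of an invertible affine flow x = scale[k] * z + shift[k].
# Conditioning on k lets each code account for one well-separated mode,
# which a single Gaussian-to-data map struggles to do.
scale = np.array([0.5, 0.5])
shift = np.array([-4.0, 4.0])

def sample(n):
    """Draw n samples: pick a code k, then push z ~ N(0, 1) through code k's flow."""
    k = rng.choice(codes, size=n, p=prior)
    z = rng.standard_normal(n)
    return scale[k] * z + shift[k], k

def log_density(x, k):
    """Exact conditional density via change of variables: log N(z) - log|scale[k]|."""
    z = (x - shift[k]) / scale[k]
    return -0.5 * (z**2 + np.log(2 * np.pi)) - np.log(np.abs(scale[k]))

x, k = sample(10_000)
# Samples land in two isolated modes, one per latent code.
print(x[k == 0].mean(), x[k == 1].mean())
```

In the actual framework, both the prior over the latent sequence and the conditional flow are trained jointly with variational Bayes; the toy above only illustrates why a discrete latent resolves the multi-mode difficulty.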

29 Apr 2022
10:00am - 11:00am
Where
https://hkust.zoom.us/j/94919234810 (Passcode: 656307)
Speakers/Performers
Mr. Hanze DONG
HKUST
Organizer(s)
Department of Mathematics
Audience
Alumni, Faculty and staff, PG students, UG students
Language(s)
English