Bayesian aggregation has many desirable properties in both theory and practice, and has been shown to be more stable and flexible than single-model selection. However, for large models, optimizing and performing inference with the posterior is resource-intensive in practice. This work therefore considers a general framework for Bayesian aggregation on over-parametrized models, especially neural networks. In particular, rather than using an explicit Gibbs distribution as in conventional approaches, we leverage samples from a Markov chain Monte Carlo (MCMC) process driven by Langevin-like dynamics with anisotropic noise, and aggregate the resulting models by recalibrating on training data. With different noise shapes, the corresponding posterior has favorable properties in the over-parametrized setting. Moreover, recalibration techniques help us obtain an efficient, well-calibrated model at inference time.
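
To make the sampling and aggregation steps concrete, the following minimal Python (PyTorch) sketch illustrates the general idea described above; it is not the speaker's implementation. Posterior samples are collected with SGLD-style updates whose injected Gaussian noise is scaled per parameter by an RMSprop-style preconditioner (one possible anisotropic noise shape), and the averaged predictive distribution is then recalibrated by fitting a single temperature on held-out data. All function names, hyperparameters, and the choice of preconditioner are assumptions made for illustration.

    # Illustrative sketch only: anisotropic Langevin-like sampling plus
    # temperature-scaling recalibration of the aggregated predictions.
    import torch
    import torch.nn.functional as F

    def sgld_sample(model, loader, n_steps=1000, lr=1e-4, beta=0.99,
                    eps=1e-8, keep_every=100):
        """Collect posterior samples via Langevin-like updates whose injected
        Gaussian noise is scaled per parameter by a running squared-gradient
        estimate (one possible anisotropic noise shape)."""
        samples = []
        v = [torch.zeros_like(p) for p in model.parameters()]
        data_iter = iter(loader)
        for step in range(n_steps):
            try:
                x, y = next(data_iter)
            except StopIteration:
                data_iter = iter(loader)
                x, y = next(data_iter)
            model.zero_grad()
            F.cross_entropy(model(x), y).backward()
            with torch.no_grad():
                for p, vi in zip(model.parameters(), v):
                    vi.mul_(beta).addcmul_(p.grad, p.grad, value=1 - beta)
                    precond = 1.0 / (vi.sqrt() + eps)           # anisotropic scale
                    noise = torch.randn_like(p) * (2 * lr * precond).sqrt()
                    p.add_(-lr * precond * p.grad + noise)       # Langevin-like step
            if (step + 1) % keep_every == 0:
                samples.append([p.detach().clone() for p in model.parameters()])
        return samples

    def aggregate_and_recalibrate(model, samples, x_val, y_val):
        """Average predictive probabilities over the collected samples, then
        fit a single temperature on held-out data for calibration."""
        probs = []
        with torch.no_grad():
            for s in samples:
                for p, w in zip(model.parameters(), s):
                    p.copy_(w)
                probs.append(F.softmax(model(x_val), dim=-1))
        avg_logits = torch.log(torch.stack(probs).mean(0) + 1e-12)
        log_t = torch.zeros(1, requires_grad=True)
        opt = torch.optim.LBFGS([log_t], lr=0.1, max_iter=50)
        def closure():
            opt.zero_grad()
            loss = F.cross_entropy(avg_logits / log_t.exp(), y_val)
            loss.backward()
            return loss
        opt.step(closure)
        return avg_logits, log_t.exp().item()
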

May 5
10:00am - 11:00am
Venue
https://hkust.zoom.us/j/92896643876 (Passcode: 014877)
Speaker/Performer
Mr. Hanze DONG
Organizer
Department of Mathematics
Contact
Payment Details
Audience
Alumni, Faculty and staff, PG students, UG students
Language
English