Bayesian aggregation enjoys good properties in both theory and practice, and has been shown to be more stable and flexible than single-model selection. For large models, however, optimizing and performing inference over posterior models is resource-intensive in practice. This work therefore considers a general framework for performing Bayesian aggregation on over-parametrized models, especially neural networks. In particular, rather than using an explicit Gibbs distribution as in conventional models, we leverage samples from a Markov chain Monte Carlo (MCMC) process driven by Langevin-like dynamics with anisotropic noise, and aggregate models by recalibrating on the training data. With different noise shapes, the corresponding posteriors exhibit desirable properties in the over-parametrized setting. Moreover, recalibration techniques help us obtain an efficient, well-calibrated model at inference time.
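As a rough illustration of the sampling mechanism mentioned in the abstract, the following is a minimal sketch of a Langevin-like update with anisotropic (per-coordinate) noise on a toy Gaussian target; the update rule, variable names, and target are illustrative assumptions, not the speaker's actual algorithm:

```python
import numpy as np

def langevin_sample(grad_log_post, theta0, step, noise_var, n_steps, rng):
    """Langevin-like MCMC with anisotropic noise (illustrative sketch):
    theta <- theta + step * grad_log_post(theta) + sqrt(2*step*noise_var) * xi,
    where noise_var sets a different noise scale per coordinate."""
    theta = np.array(theta0, dtype=float)
    samples = []
    for _ in range(n_steps):
        xi = rng.standard_normal(theta.shape)
        theta = theta + step * grad_log_post(theta) \
              + np.sqrt(2.0 * step * noise_var) * xi
        samples.append(theta.copy())
    return np.array(samples)

# Toy target: Gaussian posterior with grad log p(theta) = -theta.
rng = np.random.default_rng(0)
noise_var = np.array([1.0, 2.0])   # anisotropic noise shape
draws = langevin_sample(lambda t: -t, np.zeros(2), 0.1, noise_var, 5000, rng)
draws = draws[500:]                # discard burn-in

# Aggregate over posterior samples, e.g. by averaging predictions;
# here the "prediction" is just the parameter itself.
posterior_mean = draws.mean(axis=0)
```

The different per-coordinate noise variances change the stationary distribution the chain explores, which is the sense in which the noise shape controls the effective posterior.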

May 5
10:00am - 11:00am
Venue
https://hkust.zoom.us/j/92896643876 (Passcode: 014877)
Speaker/Performer
Mr. Hanze DONG
Organizer
Department of Mathematics
Audience
Alumni, Faculty and staff, PG students, UG students
Language
English