In this seminar, we will discuss private federated learning. First, we will present new optimization error bounds for differentially private federated learning with Laplacian smoothing (DP-Fed-LS) under heterogeneous data. These bounds clarify how the noise introduced by differential privacy, the heterogeneity of the data, and the variance of stochastic gradient descent affect the convergence of DP-Fed-LS. Second, we will explore how to push the limits of private federated learning by improving existing gradient attacks. Experiments show that our proposed attack can recover training data with high fidelity even when the targeted model is untrained and the batch size is small. Attacks in more realistic settings will also be discussed.
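To illustrate the idea behind Laplacian smoothing of noisy gradients, here is a minimal sketch (not the speaker's implementation; the function name and the smoothing parameter `sigma` are illustrative). The smoothed gradient solves (I - sigma * Delta) g_ls = g, where Delta is the circular 1-D discrete Laplacian; since the operator is circulant, the solve can be done with an FFT. The smoothing acts as a low-pass filter with unit DC gain, so it damps the high-frequency part of injected privacy noise while preserving the gradient's sum:

```python
import numpy as np

def laplacian_smooth(grad, sigma=1.0):
    """Return (I - sigma * Delta)^{-1} grad, computed via FFT,
    where Delta is the circular 1-D discrete Laplacian.

    The operator is circulant with first column
    [1 + 2*sigma, -sigma, 0, ..., 0, -sigma], so division in the
    Fourier domain inverts it in O(n log n).
    """
    grad = np.asarray(grad, dtype=float)
    d = np.zeros_like(grad)
    d[0] = 1.0 + 2.0 * sigma
    d[1] = -sigma
    d[-1] = -sigma
    return np.real(np.fft.ifft(np.fft.fft(grad) / np.fft.fft(d)))

# Illustrative use in a DP setting: add Laplace noise for privacy,
# then smooth the noisy gradient before the descent step.
rng = np.random.default_rng(0)
grad = np.array([1.0, -2.0, 3.0, 0.5])
noisy = grad + rng.laplace(scale=0.5, size=grad.shape)
smoothed = laplacian_smooth(noisy, sigma=1.0)
```

Because the DC Fourier component of the operator equals 1, the smoothed gradient has the same sum as the noisy one, while all other frequency components are attenuated, which reduces the variance of the noise.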

29 Apr 2021
10:15am - 11:15am
Where
https://hkust.zoom.us/j/99997376210 (Passcode: 214192)
Speakers/Performers
Mr. Zhicong LIANG
Organizer(s)
Department of Mathematics
Audience
Alumni, Faculty and staff, PG students, UG students
Language(s)
English