SEMINAR

Recent Progress on Grokking and Probabilistic Federated Learning

Speaker

Thang D. Bui

Affiliation
Australian National University
Timeline
Fri, Jan 26 2024 - 10:00 am (GMT + 7)
About Speaker

Thang Bui is a lecturer in Machine Learning at the School of Computing, Australian National University. He is broadly interested in machine learning and statistics with a particular focus on neural networks, probabilistic models, approximate Bayesian inference, and sequential decision making under uncertainty.

Abstract

This talk is divided into two parts. The first part delves into recent empirical observations on the grokking phenomenon, where neural networks achieve perfect or near-perfect accuracy on the validation set long after similar performance has been attained on the training set. We demonstrate that grokking is not limited to neural networks but also occurs in other settings such as Gaussian process (GP) classification, GP regression, and linear regression. We hypothesize that the phenomenon is governed by the accessibility of certain regions in the error and complexity landscapes. In the second part, I will discuss federated training of probabilistic models, specifically Bayesian neural networks and Gaussian processes, employing partitioned variational inference. Additionally, I will highlight some pitfalls of current techniques and discuss potential future directions.
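As a rough illustration of the phenomenon the abstract describes, one can measure the "grokking delay" as the number of epochs between the training curve and the validation curve first reaching a target accuracy. This is a minimal sketch with made-up toy curves, not code or terminology from the talk itself:

```python
# Minimal sketch (not from the talk): quantify grokking as the gap between
# the epoch at which training accuracy first reaches a threshold and the
# epoch at which validation accuracy does. Curves below are illustrative.

def first_epoch_above(accuracies, threshold=0.99):
    """Return the first epoch index whose accuracy reaches threshold, or None."""
    for epoch, acc in enumerate(accuracies):
        if acc >= threshold:
            return epoch
    return None

def grokking_delay(train_acc, val_acc, threshold=0.99):
    """Epochs between training and validation accuracy reaching threshold."""
    t = first_epoch_above(train_acc, threshold)
    v = first_epoch_above(val_acc, threshold)
    if t is None or v is None:
        return None  # one of the curves never reached the threshold
    return v - t

# Toy curves: training is solved by epoch 2; validation lags until epoch 8.
train = [0.5, 0.9, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0]
val   = [0.4, 0.5, 0.5, 0.5, 0.6, 0.6, 0.7, 0.9, 1.0, 1.0]
print(grokking_delay(train, val))  # → 6
```

A large positive delay is the signature of grokking: the model fits the training data long before it generalizes.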

Related seminars

Dr. Tu Vu

Virginia Tech

Efficient Model Development in the Era of Large Language Models
Tue, Nov 5 2024 - 09:30 am (GMT + 7)
Representation Learning with Graph Autoencoders and Applications to Music Recommendation
Fri, Jul 26 2024 - 10:00 am (GMT + 7)

Trieu Trinh

Google DeepMind

AlphaGeometry: Solving IMO Geometry without Human Demonstrations
Fri, Jul 5 2024 - 10:00 am (GMT + 7)

Tat-Jun (TJ) Chin

University of Adelaide

Quantum Computing in Computer Vision: A Case Study in Robust Geometric Optimisation
Fri, Jun 7 2024 - 11:00 am (GMT + 7)