

Virtual presentation / poster accept

Test-Time Robust Personalization for Federated Learning

Liangze Jiang · Tao Lin

Keywords: [ Deep Learning and representational learning ] [ Personalized Federated Learning ] [ Test-time Robustness ] [ federated learning ]


Abstract:

Federated Learning (FL) is a machine learning paradigm in which many clients collaboratively learn a shared global model from decentralized training data. Personalizing FL models additionally adapts the global model to individual clients, achieving promising results when local training and test distributions are consistent. However, for real-world personalized FL applications, it is crucial to go one step further: robustifying FL models against the evolving local test distribution during deployment, where various types of distribution shifts can arise. In this work, we identify the pitfalls of existing works under test-time distribution shifts and propose Federated Test-time Head Ensemble plus tuning (FedTHE+), which personalizes FL models with robustness to various test-time distribution shifts. We illustrate the advancement of FedTHE+ (and its computationally efficient variant FedTHE) over strong competitors, by training various neural architectures (CNN, ResNet, and Transformer) on CIFAR10 and ImageNet and evaluating on diverse test distributions. Along with this, we build a benchmark for assessing the performance and robustness of personalized FL methods during deployment. Code: https://github.com/LINs-lab/FedTHE.
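The core idea named in the title and abstract is a test-time ensemble of a global head and a personalized head, plus unsupervised test-time tuning. The sketch below illustrates that idea in PyTorch under explicit assumptions: the `TwoHeadModel` structure, the convex combination weight `alpha`, and the entropy-minimization tuning loop are illustrative stand-ins and not the authors' exact FedTHE/FedTHE+ algorithm (see the linked repository for the real implementation).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class TwoHeadModel(nn.Module):
    """Shared feature extractor with a global head and a personalized head.

    A minimal sketch of the 'test-time head ensemble' idea; the exact
    ensembling rule in FedTHE/FedTHE+ may differ.
    """

    def __init__(self, feature_extractor: nn.Module, feat_dim: int, num_classes: int):
        super().__init__()
        self.feature_extractor = feature_extractor          # shared representation
        self.global_head = nn.Linear(feat_dim, num_classes)    # from the global FL model
        self.personal_head = nn.Linear(feat_dim, num_classes)  # locally personalized

    def forward(self, x: torch.Tensor, alpha: torch.Tensor) -> torch.Tensor:
        feats = self.feature_extractor(x)
        logits_global = self.global_head(feats)
        logits_personal = self.personal_head(feats)
        # Convex combination of the two heads; alpha in [0, 1] can be
        # adapted at deployment time to trade personalization for robustness.
        return alpha * logits_personal + (1.0 - alpha) * logits_global


def test_time_tune_alpha(model: TwoHeadModel, x: torch.Tensor,
                         steps: int = 5, lr: float = 0.1) -> torch.Tensor:
    """Illustrative test-time tuning of the ensemble weight via entropy
    minimization on an unlabeled test batch (a hypothetical stand-in for
    the '+ tuning' component)."""
    alpha = torch.tensor(0.5, requires_grad=True)
    optimizer = torch.optim.SGD([alpha], lr=lr)
    for _ in range(steps):
        logits = model(x, alpha.clamp(0.0, 1.0))
        probs = F.softmax(logits, dim=-1)
        entropy = -(probs * probs.clamp_min(1e-8).log()).sum(dim=-1).mean()
        optimizer.zero_grad()
        entropy.backward()
        optimizer.step()
    return alpha.detach().clamp(0.0, 1.0)


if __name__ == "__main__":
    # Toy usage with a random feature extractor and a fake test batch.
    backbone = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 64), nn.ReLU())
    model = TwoHeadModel(backbone, feat_dim=64, num_classes=10)
    test_batch = torch.randn(8, 3, 32, 32)
    alpha = test_time_tune_alpha(model, test_batch)
    print(f"tuned ensemble weight alpha = {alpha.item():.3f}")
```

The convex combination lets a client fall back toward the global head when its local test distribution drifts away from its training distribution, and toward the personalized head when the two remain aligned; the abstract's benchmark evaluates exactly this trade-off across diverse test distributions.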
