

Poster in Workshop: The 4th Workshop on Practical ML for Developing Countries: Learning Under Limited/Low Resource Settings

JumpStyle: A Framework for Data-Efficient Online Adaptation

Aakash Singh · Manogna Sreenivas · Soma Biswas


Abstract:

Deep learning research in developing countries is constrained by a lack of computational resources, quality training data, and expert knowledge, which degrades the performance of deep networks. Moreover, these models are prone to suffer from distribution shift at test time. To address these challenges, this paper presents a novel approach for fine-tuning deep networks in a Domain Generalization setting. The proposed framework, JumpStyle, comprises two key components: (1) an innovative initialization technique that jumpstarts the adaptation process, and (2) style-aware augmentation with pseudo-labeling, used in conjunction with Tent, a simple and effective test-time adaptation baseline. Importantly, JumpStyle only requires access to a pre-trained model and is not limited by the training method. The effectiveness of the approach is extensively evaluated through experiments.
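Tent, the test-time adaptation baseline mentioned above, adapts a frozen pre-trained model by minimizing the entropy of its own predictions on unlabeled test batches, updating only a small set of affine (scale/shift) parameters. A minimal NumPy sketch of that entropy-minimization idea, using a toy linear classifier and finite-difference gradients with backtracking (an illustration of the principle, not the authors' implementation or the original Tent code):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(32, 5))       # unlabeled test batch (hypothetical data)
W = rng.normal(size=(5, 3))        # frozen "pre-trained" classifier weights

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def mean_entropy(params):
    # Only the affine parameters (gamma: scale, beta: shift) are adapted;
    # W stays frozen, as in Tent.
    gamma, beta = params[:5], params[5:]
    probs = softmax((X * gamma + beta) @ W)
    return -(probs * np.log(probs + 1e-12)).sum(axis=1).mean()

def num_grad(params, eps=1e-5):
    # Finite-difference gradient (autograd would be used in practice).
    g = np.zeros_like(params)
    for i in range(len(params)):
        hi, lo = params.copy(), params.copy()
        hi[i] += eps
        lo[i] -= eps
        g[i] = (mean_entropy(hi) - mean_entropy(lo)) / (2 * eps)
    return g

params = np.concatenate([np.ones(5), np.zeros(5)])  # gamma=1, beta=0
h_init = mean_entropy(params)
for _ in range(30):
    grad = num_grad(params)
    step = 0.05
    # Backtracking: only accept steps that reduce the entropy objective.
    while step > 1e-8 and mean_entropy(params - step * grad) > mean_entropy(params):
        step *= 0.5
    params = params - step * grad
h_final = mean_entropy(params)
print(f"entropy: {h_init:.4f} -> {h_final:.4f}")
```

The prediction entropy on the test batch drops as the affine parameters adapt, while the frozen weights `W` are untouched; JumpStyle builds on this baseline with its initialization scheme and style-aware pseudo-labeling.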
