

Poster

Quick-Tune: Quickly Learning Which Pretrained Model to Finetune and How

Sebastian Pineda Arango · Fabio Ferreira · Arlind Kadra · Frank Hutter · Josif Grabocka

Halle B #180
Thu 9 May 1:45 a.m. PDT — 3:45 a.m. PDT
Oral presentation: Oral 5C
Thu 9 May 1 a.m. PDT — 1:45 a.m. PDT

Abstract:

With the ever-increasing number of pretrained models, machine learning practitioners are continuously faced with the decision of which pretrained model to use and how to finetune it for a new dataset. In this paper, we propose a methodology that jointly searches for the optimal pretrained model and the hyperparameters for finetuning it. Our method transfers knowledge about the performance of many pretrained models, under multiple hyperparameter configurations, across a series of datasets. To this end, we evaluated over 20k hyperparameter configurations for finetuning 24 pretrained image classification models on 87 datasets, generating a large-scale meta-dataset. We meta-learn a gray-box performance predictor on the learning curves of this meta-dataset and use it for fast hyperparameter optimization on new datasets. We empirically demonstrate that our resulting approach can quickly select an accurate pretrained model for a new dataset, together with its optimal hyperparameters.
