
MiniPLM: Knowledge Distillation for Pre-training Language Models

Yuxian Gu · Hao Zhou · Fandong Meng · Jie Zhou · Minlie Huang
2025 Poster

Abstract
