

Poster in Workshop: The 4th Workshop on Practical ML for Developing Countries: Learning Under Limited/Low Resource Settings

Intermediate Task Fine-tuning of Sequence-Sequence Language Models with Auxiliary Domain Parallel Data for Low-resource NMT

Shravan Nayak · Sarubi Thillainathan · Surangika Ranathunga · Rikki Hung · Yining Wang · Jonah Mackey · Andrew Ho · Anthony Rinaldi · En-Shiun Annie Lee


Abstract:

NMT systems built on Pre-trained Multilingual Sequence-Sequence (PMSS) models flounder when sufficient parallel data is not available for fine-tuning. This is especially true for languages that are missing or under-represented in these models. The problem is further aggravated when the data comes from different domains. In this paper, we show that intermediate-task fine-tuning (ITFT) of PMSS models is extremely beneficial for domain-specific NMT, especially when target-domain data is limited or unavailable and the considered languages are missing or under-represented in the PMSS model. We quantify the variation in domain-specific results using a domain-divergence test, and show that ITFT can mitigate the impact of domain divergence to some extent.
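
For readers who want a concrete picture of the two-stage recipe described above, the sketch below outlines intermediate-task fine-tuning for NMT with Hugging Face Transformers, assuming mBART-50 as the PMSS model. The checkpoint, language pair, hyperparameters, and dataset variables are illustrative assumptions, not the authors' exact setup.

```python
# Minimal sketch of intermediate-task fine-tuning (ITFT) for NMT.
# Assumptions (not from the paper): mBART-50 as the PMSS model,
# English-Sinhala as the language pair, and pre-tokenized
# `datasets.Dataset` objects named aux_dataset / target_dataset.
from transformers import (
    MBart50TokenizerFast,
    MBartForConditionalGeneration,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainingArguments,
    Seq2SeqTrainer,
)

model_name = "facebook/mbart-large-50-many-to-many-mmt"
tokenizer = MBart50TokenizerFast.from_pretrained(
    model_name, src_lang="en_XX", tgt_lang="si_LK"
)
model = MBartForConditionalGeneration.from_pretrained(model_name)


def preprocess(batch):
    # Tokenize parallel sentence pairs for seq2seq training.
    return tokenizer(
        batch["source"], text_target=batch["target"],
        truncation=True, max_length=128,
    )


def finetune(dataset, output_dir, epochs):
    # One fine-tuning stage on a (tokenized) parallel corpus.
    args = Seq2SeqTrainingArguments(
        output_dir=output_dir,
        num_train_epochs=epochs,
        per_device_train_batch_size=8,
        learning_rate=3e-5,
        save_strategy="no",
    )
    collator = DataCollatorForSeq2Seq(tokenizer, model=model)
    trainer = Seq2SeqTrainer(
        model=model, args=args, train_dataset=dataset,
        data_collator=collator, tokenizer=tokenizer,
    )
    trainer.train()


# Stage 1 (intermediate task): fine-tune on the auxiliary-domain parallel data.
# finetune(aux_dataset, "itft-stage1", epochs=3)

# Stage 2: continue fine-tuning on the (small) target-domain data, if any.
# finetune(target_dataset, "itft-stage2", epochs=3)
```

The key design point is that both stages update the same model: the auxiliary-domain pass adapts the PMSS model to the translation task (and to the under-represented languages) before any scarce target-domain data is used.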
