Conference paper

A Study of Residual Adapters for Multi-Domain Neural Machine Translation

Minh Quang Pham 1, 2 Josep-Maria Crego 1 François Yvon 2 Jean Senellart 1
2 TLP - Traitement du Langage Parlé
LIMSI - Laboratoire d'Informatique pour la Mécanique et les Sciences de l'Ingénieur
Abstract: Domain adaptation is an old and vexing problem for machine translation systems. The most common and successful approach to supervised adaptation is to fine-tune a baseline system with in-domain parallel data. Standard fine-tuning, however, modifies all the network parameters, which makes this approach computationally costly and prone to overfitting. A recent, lightweight approach instead augments a baseline model with supplementary (small) adapter layers, keeping the rest of the model unchanged. This has the additional merit of leaving the baseline model intact and adaptable to multiple domains. In this paper, we conduct a thorough analysis of the adapter model in the context of a multi-domain machine translation task. We contrast multiple implementations of this idea using two language pairs. Our main conclusions are that residual adapters provide a fast and cheap method for supervised multi-domain adaptation; our two variants prove as effective as the original adapter model and open perspectives for making adapted models more robust to domain label errors.
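The adapter mechanism described in the abstract, a small bottleneck layer added on top of a frozen baseline representation with a residual connection, can be illustrated with a minimal sketch. This is a hypothetical illustration under common assumptions about residual adapters (down-projection, nonlinearity, up-projection, residual add), not the authors' exact architecture; all names and shapes here are assumptions.

```python
import numpy as np

def residual_adapter(h, W_down, W_up):
    """Bottleneck residual adapter (hypothetical sketch).

    Only W_down and W_up would be trained per domain; the base model
    producing h stays frozen. Shapes: h (d,), W_down (d, r), W_up (r, d),
    with bottleneck size r much smaller than hidden size d.
    """
    z = np.maximum(0.0, h @ W_down)  # down-projection + ReLU
    return h + z @ W_up              # up-projection + residual connection

# Toy usage: hidden size d=8, bottleneck r=2.
rng = np.random.default_rng(0)
d, r = 8, 2
h = rng.standard_normal(d)
W_down = rng.standard_normal((d, r)) * 0.01  # near-zero init keeps the
W_up = rng.standard_normal((r, d)) * 0.01    # adapter close to identity
out = residual_adapter(h, W_down, W_up)
assert out.shape == (d,)
```

Because the adapter is residual and initialized near zero, it starts out close to the identity function, so the adapted model initially behaves like the unmodified baseline, which is part of what makes the approach cheap and stable to train.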
Document type:
Conference paper
Complete list of metadata
Contributor: Limsi Publications
Submitted on: Thursday, November 19, 2020 - 12:00:50
Last modified on: Monday, February 22, 2021 - 16:21:17
Long-term archiving on: Saturday, February 20, 2021 - 19:05:31


Publisher files allowed on an open archive


  • HAL Id : hal-03013197, version 1



Minh Quang Pham, Josep-Maria Crego, François Yvon, Jean Senellart. A Study of Residual Adapters for Multi-Domain Neural Machine Translation. Conference on Machine Translation, Nov 2020, Online, United States. ⟨hal-03013197⟩


