
Publication

Only for the Unseen Languages, Say the Llamas: On the Efficacy of Language Adapters for Cross-lingual Transfer in English-centric LLMs

Julian Alexander Ruben Schlenker; Jenny Kunz; Tatiana Anikina; Günter Neumann; Simon Ostermann
In: Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Student Research Workshop). ACL Student Research Workshop (ACL-IJCNLP-SRW-2025), located at ACL, July 27 - August 1, Vienna, Austria, Association for Computational Linguistics, 7/2025.

Abstract

Most state-of-the-art large language models (LLMs) are trained mainly on English data, limiting their effectiveness on non-English, especially low-resource, languages. This study investigates whether language adapters can facilitate cross-lingual transfer in English-centric LLMs. We train language adapters for 13 languages using Llama 2 (7B) and Llama 3.1 (8B) as base models, and evaluate their effectiveness on two downstream tasks (MLQA and SIB-200) using either task adapters or in-context learning. Our results reveal that language adapters improve performance for languages not seen during pre-training, but provide negligible benefit for seen languages. These findings highlight the limitations of language adapters as a general solution for multilingual adaptation in English-centric LLMs.
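To make the setup concrete, the sketch below shows one way to train a language adapter on an English-centric base model: freeze the base model and continue causal language modelling on target-language text with only adapter weights trainable. This is a minimal illustration assuming LoRA-style adapters via HuggingFace PEFT; the paper's actual adapter architecture, target modules, and hyperparameters may differ.

```python
# Sketch: language-adapter training on an English-centric LLM.
# Assumptions (not from the paper): LoRA-style adapters via HuggingFace PEFT,
# causal-LM objective on unlabeled target-language text.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

base_id = "meta-llama/Llama-2-7b-hf"  # one of the base models used in the paper
tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(base_id)

# Freeze the base model; only the adapter parameters are updated.
lora_cfg = LoraConfig(
    r=16,                                  # adapter rank (illustrative value)
    lora_alpha=32,
    target_modules=["q_proj", "v_proj"],   # attach adapters to attention projections
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_cfg)
model.print_trainable_parameters()

# Training loop (omitted): continue language modelling on a monolingual corpus
# of the target language, e.g. with transformers.Trainer; afterwards the adapter
# can be combined with a task adapter or used with in-context learning.
```

For downstream evaluation, the trained language adapter would be kept frozen while either a task adapter is trained on the task data (MLQA or SIB-200) or the adapted model is prompted via in-context learning.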

Projects