{"year":"2025","page":"6420–6446","_id":"61753","title":"LOLA – An Open-Source Massively Multilingual Large Language Model","citation":{"mla":"Srivastava, Nikit, et al. “LOLA – An Open-Source Massively Multilingual Large Language Model.” Proceedings of the 31st International Conference on Computational Linguistics, edited by Owen Rambow et al., Association for Computational Linguistics, 2025, pp. 6420–6446.","ieee":"N. Srivastava et al., “LOLA – An Open-Source Massively Multilingual Large Language Model,” in Proceedings of the 31st International Conference on Computational Linguistics, 2025, pp. 6420–6446.","bibtex":"@inproceedings{Srivastava_Kuchelev_Moteu Ngoli_Shetty_Roeder_Zahera_Moussallem_Ngonga Ngomo_2025, place={Abu Dhabi, UAE}, title={LOLA – An Open-Source Massively Multilingual Large Language Model}, booktitle={Proceedings of the 31st International Conference on Computational Linguistics}, publisher={Association for Computational Linguistics}, author={Srivastava, Nikit and Kuchelev, Denis and Moteu Ngoli, Tatiana and Shetty, Kshitij and Roeder, Michael and Zahera, Hamada Mohamed Abdelsamee and Moussallem, Diego and Ngonga Ngomo, Axel-Cyrille}, editor={Rambow, Owen and Wanner, Leo and Apidianaki, Marianna and Al-Khalifa, Hend and Eugenio, Barbara Di and Schockaert, Steven}, year={2025}, pages={6420–6446} }","ama":"Srivastava N, Kuchelev D, Moteu Ngoli T, et al. LOLA – An Open-Source Massively Multilingual Large Language Model. In: Rambow O, Wanner L, Apidianaki M, Al-Khalifa H, Eugenio BD, Schockaert S, eds. Proceedings of the 31st International Conference on Computational Linguistics. Association for Computational Linguistics; 2025:6420–6446.","short":"N. Srivastava, D. Kuchelev, T. Moteu Ngoli, K. Shetty, M. Roeder, H.M.A. Zahera, D. Moussallem, A.-C. Ngonga Ngomo, in: O. Rambow, L. Wanner, M. Apidianaki, H. Al-Khalifa, B.D. Eugenio, S. Schockaert (Eds.), Proceedings of the 31st International Conference on Computational Linguistics, Association for Computational Linguistics, Abu Dhabi, UAE, 2025, pp. 6420–6446.","chicago":"Srivastava, Nikit, Denis Kuchelev, Tatiana Moteu Ngoli, Kshitij Shetty, Michael Roeder, Hamada Mohamed Abdelsamee Zahera, Diego Moussallem, and Axel-Cyrille Ngonga Ngomo. “LOLA – An Open-Source Massively Multilingual Large Language Model.” In Proceedings of the 31st International Conference on Computational Linguistics, edited by Owen Rambow, Leo Wanner, Marianna Apidianaki, Hend Al-Khalifa, Barbara Di Eugenio, and Steven Schockaert, 6420–6446. Abu Dhabi, UAE: Association for Computational Linguistics, 2025.","apa":"Srivastava, N., Kuchelev, D., Moteu Ngoli, T., Shetty, K., Roeder, M., Zahera, H. M. A., Moussallem, D., & Ngonga Ngomo, A.-C. (2025). LOLA – An Open-Source Massively Multilingual Large Language Model. In O. Rambow, L. Wanner, M. Apidianaki, H. Al-Khalifa, B. D. Eugenio, & S. Schockaert (Eds.), Proceedings of the 31st International Conference on Computational Linguistics (pp. 6420–6446). Association for Computational Linguistics."},"publication":"Proceedings of the 31st International Conference on Computational Linguistics","publisher":"Association for Computational Linguistics","user_id":"70066","main_file_link":[{"url":"https://aclanthology.org/2025.coling-main.428.pdf","open_access":"1"}],"place":"Abu Dhabi, UAE","language":[{"iso":"eng"}],"abstract":[{"lang":"eng","text":"This paper presents LOLA, a massively multilingual large language model trained on more than 160 languages using a sparse Mixture-of-Experts Transformer architecture. 
Our architectural and implementation choices address the challenge of harnessing linguistic diversity while maintaining efficiency and avoiding the common pitfalls of multilinguality. Our analysis of the evaluation results shows competitive performance in natural language generation and understanding tasks. Additionally, we demonstrate how the learned expert-routing mechanism exploits implicit phylogenetic linguistic patterns to potentially alleviate the curse of multilinguality. We provide an in-depth look at the training process, an analysis of the datasets, and a balanced exploration of the model’s strengths and limitations. As an open-source model, LOLA promotes reproducibility and serves as a robust foundation for future research. Our findings enable the development of compute-efficient multilingual models with strong, scalable performance across languages."}],"author":[{"id":"70066","first_name":"Nikit","last_name":"Srivastava","orcid":"0009-0004-5164-4911","full_name":"Srivastava, Nikit"},{"id":"70842","first_name":"Denis","last_name":"Kuchelev","full_name":"Kuchelev, Denis"},{"full_name":"Moteu Ngoli, Tatiana","last_name":"Moteu Ngoli","id":"99174","first_name":"Tatiana"},{"last_name":"Shetty","full_name":"Shetty, Kshitij","first_name":"Kshitij"},{"last_name":"Roeder","full_name":"Roeder, Michael","first_name":"Michael"},{"last_name":"Zahera","full_name":"Zahera, Hamada Mohamed Abdelsamee","orcid":"0000-0003-0215-1278","first_name":"Hamada Mohamed Abdelsamee","id":"72768"},{"id":"71635","first_name":"Diego","full_name":"Moussallem, Diego","last_name":"Moussallem"},{"full_name":"Ngonga Ngomo, Axel-Cyrille","last_name":"Ngonga Ngomo","first_name":"Axel-Cyrille","id":"65716"}],"status":"public","oa":"1","type":"conference","date_updated":"2025-12-02T19:29:20Z","date_created":"2025-10-08T11:02:30Z","editor":[{"first_name":"Owen","full_name":"Rambow, Owen","last_name":"Rambow"},{"full_name":"Wanner, Leo","last_name":"Wanner","first_name":"Leo"},{"first_name":"Marianna","full_name":"Apidianaki, Marianna","last_name":"Apidianaki"},{"first_name":"Hend","last_name":"Al-Khalifa","full_name":"Al-Khalifa, Hend"},{"first_name":"Barbara Di","full_name":"Eugenio, Barbara Di","last_name":"Eugenio"},{"first_name":"Steven","full_name":"Schockaert, Steven","last_name":"Schockaert"}]}