MEXA: Multilingual Evaluation of Open English-Centric LLMs via Cross-Lingual Alignment

About

We introduce MEXA, a method for assessing the multilingual capabilities of English-centric large language models (LLMs). MEXA builds on the observation that English-centric LLMs semantically use English as a kind of pivot language in their intermediate layers. MEXA computes the alignment between non-English languages and English using parallel sentences, estimating how well language understanding capabilities transfer from English to other languages through this alignment. The resulting score can be used to estimate task performance in a given language, provided we know the model's English performance on the task and the alignment score derived from a parallel dataset.
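
Below is a minimal sketch of the alignment computation as we read it from the paper: for a batch of parallel sentence pairs embedded at one layer, a pair counts as aligned if its cosine similarity strictly beats every non-parallel pairing in both directions. The function and variable names are illustrative, not taken from the MEXA repository:

```python
import numpy as np

def mexa_alignment(eng_embs: np.ndarray, tgt_embs: np.ndarray) -> float:
    """Fraction of parallel pairs whose aligned similarity strictly beats
    every non-aligned pair in both directions, for one model layer.

    eng_embs, tgt_embs: (n, d) arrays of sentence embeddings, row i of each
    being a translation pair.
    """
    # L2-normalize so the dot product below is cosine similarity.
    e = eng_embs / np.linalg.norm(eng_embs, axis=1, keepdims=True)
    t = tgt_embs / np.linalg.norm(tgt_embs, axis=1, keepdims=True)
    sim = e @ t.T  # sim[i, j] = cos(eng_i, tgt_j)

    n = sim.shape[0]
    diag = np.diag(sim)
    # Mask the diagonal so row/column maxima only see non-parallel pairs.
    masked = np.where(np.eye(n, dtype=bool), -np.inf, sim)
    # Pair i is aligned iff sim[i, i] exceeds every other entry in its row
    # (eng_i vs. all targets) and its column (all English vs. tgt_i).
    aligned = (diag > masked.max(axis=1)) & (diag > masked.max(axis=0))
    return float(aligned.mean())
```

Computing this per layer yields a vector of alignment scores, which the leaderboard then pools as described under Details.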

Code

https://github.com/cisnlp/MEXA

Details

We use parallel datasets from FLORES and the Bible. In the ARC-style setting, we apply mean pooling over layers, and the English score achieved by each LLM on the ARC benchmark is used to adjust the multilingual scores. In the Belebele-style setting, we apply max pooling over layers, and the English score achieved by each LLM on Belebele is used to adjust the multilingual scores.
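
A hedged sketch of how the two settings combine per-layer alignments with the English benchmark score, assuming "adjust" means scaling the English score by the pooled alignment; all names and numbers below are hypothetical:

```python
import numpy as np

def mexa_score(layer_alignments, pooling: str) -> float:
    """Pool per-layer alignment scores into one MEXA score for a language."""
    scores = np.asarray(layer_alignments)
    if pooling == "mean":   # ARC-style setting
        return float(scores.mean())
    if pooling == "max":    # Belebele-style setting
        return float(scores.max())
    raise ValueError(f"unknown pooling: {pooling}")

# Toy inputs: per-layer alignment for one language (from the sketch above)
# and the model's English benchmark accuracies.
layer_alignments = [0.42, 0.67, 0.88, 0.91, 0.79]
english_arc, english_belebele = 0.80, 0.85

# ARC style: mean pooling over layers, scaled by the English ARC score.
arc_estimate = english_arc * mexa_score(layer_alignments, "mean")
# Belebele style: max pooling over layers, scaled by the English Belebele score.
belebele_estimate = english_belebele * mexa_score(layer_alignments, "max")
```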

Parallel Dataset

MEXA scores are reported separately for each parallel dataset (FLORES and the Bible).

Citation

@inproceedings{kargaran-etal-2025-mexa,
    title = "{MEXA}: Multilingual Evaluation of {E}nglish-Centric {LLM}s via Cross-Lingual Alignment",
    author = "Kargaran, Amir Hossein  and
      Modarressi, Ali  and
      Nikeghbal, Nafiseh  and
      Diesner, Jana  and
      Yvon, Fran{\c{c}}ois  and
      Schuetze, Hinrich",
    editor = "Che, Wanxiang  and
      Nabende, Joyce  and
      Shutova, Ekaterina  and
      Pilehvar, Mohammad Taher",
    booktitle = "Findings of the Association for Computational Linguistics: ACL 2025",
    month = jul,
    year = "2025",
    address = "Vienna, Austria",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2025.findings-acl.1385/",
    doi = "10.18653/v1/2025.findings-acl.1385",
    pages = "27001--27023",
    ISBN = "979-8-89176-256-5",
}