Transformers on Multilingual Clause-Level Morphology

Acikgoz EC, Chubakov T, Kural M, Şahin GG, Yuret D (2022)


Publication Type: Conference contribution

Publication year: 2022

Publisher: Association for Computational Linguistics (ACL)

Page range: 100-105

Conference Proceedings Title: MRL 2022 - 2nd Workshop on Multi-Lingual Representation Learning, Proceedings of the Workshop

Event location: Abu Dhabi, ARE

ISBN: 9781959429166

Abstract

This paper describes our winning systems, designed by the KUIS AI NLP team, for MRL: The 1st Shared Task on Multilingual Clause-level Morphology (an EMNLP 2022 workshop). We present our work for all three parts of the shared task: inflection, reinflection, and analysis. We mainly explore transformers with two approaches: (i) training models from scratch in combination with data augmentation, and (ii) transfer learning with prefix-tuning on multilingual morphological tasks. Data augmentation significantly improves performance for most languages in the inflection and reinflection tasks. Prefix-tuning on a pre-trained mGPT model, on the other hand, helps us adapt to the analysis task in low-data and multilingual settings. While transformer architectures with data augmentation achieved the most promising results for the inflection and reinflection tasks, prefix-tuning on mGPT achieved the best results for the analysis task. Our systems received 1st place in all three tasks in MRL 2022.

How to cite

APA:

Acikgoz, E.C., Chubakov, T., Kural, M., Şahin, G.G., & Yuret, D. (2022). Transformers on Multilingual Clause-Level Morphology. In MRL 2022 - 2nd Workshop on Multi-Lingual Representation Learning, Proceedings of the Workshop (pp. 100-105). Abu Dhabi, ARE: Association for Computational Linguistics (ACL).

MLA:

Acikgoz, Emre Can, et al. "Transformers on Multilingual Clause-Level Morphology." Proceedings of the 2nd Workshop on Multi-Lingual Representation Learning, MRL 2022, Abu Dhabi, ARE, Association for Computational Linguistics (ACL), 2022, pp. 100-105.
