NLP EMNLP Findings

Transitioning Representations between Languages for Cross-lingual Event Detection via Langevin Dynamics

January 26, 2024

Cross-lingual transfer learning (CLTL) for event detection (ED) aims to develop models in high-resource source languages that can be applied directly to lower-resource target languages with strong performance. Previous research in this area has focused on representation matching methods that build a language-universal representation space into which source- and target-language example representations are mapped to achieve cross-lingual transfer. However, because this approach modifies the representations of the source-language examples, the models might lose discriminative features for ED learned over the source-language training data, which can prevent effective predictions. To this end, our work introduces a novel approach for cross-lingual ED in which we transition only the representations of the target-language examples into the source-language space, thus preserving the source-language representations and their discriminative information. Our method employs Langevin Dynamics to perform the representation transition and a semantic preservation framework to retain event-type features during the transition process. Extensive experiments over three languages demonstrate state-of-the-art performance for ED in CLTL.
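The abstract does not specify the exact update rule, so the following is only a generic sketch of unadjusted Langevin Dynamics, the sampling procedure the method builds on: a representation is moved toward a target density p(x) via x_{t+1} = x_t + (ε/2)∇log p(x_t) + √ε z_t with z_t ~ N(0, I). The Gaussian stand-in for the source-language density, the step size, and the step count below are illustrative assumptions, not the paper's actual configuration.

```python
import math
import random

def langevin_step(x, grad_log_p, step_size, rng):
    """One unadjusted Langevin step:
    x' = x + (eps/2) * grad log p(x) + sqrt(eps) * z,  z ~ N(0, I)."""
    g = grad_log_p(x)
    return [xi + 0.5 * step_size * gi + math.sqrt(step_size) * rng.gauss(0.0, 1.0)
            for xi, gi in zip(x, g)]

def transition(x, grad_log_p, steps=200, step_size=0.01, seed=0):
    """Iterate Langevin steps to move a representation x toward the
    density defined by grad_log_p (e.g., a source-language space)."""
    rng = random.Random(seed)
    for _ in range(steps):
        x = langevin_step(x, grad_log_p, step_size, rng)
    return x

if __name__ == "__main__":
    # Hypothetical example: the "source-language space" is modeled as a
    # sharp Gaussian centered at mu, so grad log p(x) = precision * (mu - x).
    mu = [1.0, -2.0]
    grad = lambda x: [25.0 * (m - xi) for m, xi in zip(mu, x)]
    target_rep = transition([0.0, 0.0], grad)  # target-language representation
    print(target_rep)  # drifts close to mu, with small sampling noise
```

In the paper's setting the score function ∇log p(x) would come from a model of the source-language representation distribution rather than a fixed Gaussian; the Gaussian here only makes the sketch self-contained and checkable.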


Chien Nguyen, Huy Nguyen, Franck Dernoncourt, Thien Nguyen

