NLP
AAAI

Continual Relation Extraction via Sequential Multi-task Learning

January 8, 2024

Building continual relation extraction (CRE) models that can adapt to an ever-growing ontology of relations is a cornerstone information extraction task serving various dynamic real-world domains. To mitigate catastrophic forgetting in CRE, existing state-of-the-art approaches have effectively utilized rehearsal techniques from continual learning and achieved remarkable success. However, managing the multiple objectives associated with memory-based rehearsal remains underexplored: they are often combined by simple summation, overlooking complex trade-offs between them. In this paper, we propose Continual Relation Extraction via Sequential Multi-task Learning (CREST), a novel CRE approach built upon a multi-task learning framework tailored for continual learning. CREST accounts for the disparity in the magnitudes of the gradient signals of different objectives, thereby effectively handling the inherent difference between multi-task learning and continual learning. Through extensive experiments on multiple datasets, CREST demonstrates significant improvements in CRE performance as well as superiority over other state-of-the-art multi-task learning frameworks, offering a promising solution to the challenges of continual learning in this domain.
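To illustrate the kind of gradient-magnitude disparity the abstract refers to: when a new-task loss and a memory-rehearsal loss are summed naively, the objective with the larger gradient norm dominates the update. A minimal sketch of one common remedy, rescaling each objective's gradient to a shared magnitude before summing, is shown below. The function names and the specific normalization (rescaling to the mean norm) are illustrative assumptions, not CREST's actual algorithm, which the abstract does not detail.

```python
# Hypothetical sketch: rescale per-objective gradients to a common
# magnitude before summing, instead of adding raw gradients. This is
# NOT the paper's method; it only illustrates the general idea of
# balancing gradient magnitudes across objectives.

def l2_norm(v):
    """Euclidean norm of a gradient vector (plain Python list)."""
    return sum(x * x for x in v) ** 0.5

def combine_gradients(grads):
    """Rescale each objective's gradient to the mean norm, then sum."""
    norms = [l2_norm(g) for g in grads]
    target = sum(norms) / len(norms)  # shared target magnitude
    combined = [0.0] * len(grads[0])
    for g, n in zip(grads, norms):
        scale = target / n if n > 0 else 0.0  # skip zero gradients
        for i, x in enumerate(g):
            combined[i] += scale * x
    return combined

# A large-magnitude objective no longer dominates a small one:
g_task = [10.0, 0.0]    # e.g. new-task loss gradient (large norm)
g_memory = [0.0, 0.1]   # e.g. rehearsal loss gradient (small norm)
combined = combine_gradients([g_task, g_memory])
# Both components now contribute with comparable magnitude.
```

With naive summation the update would be `[10.0, 0.1]`, almost entirely driven by the new-task objective; after rescaling, both objectives contribute equally, which is the trade-off management the abstract highlights.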


Thanh-Thien Le, Manh Nguyen, Tung Thanh Nguyen, Linh Van Ngo, Thien Huu Nguyen

