
BARTpho: Pre-trained Sequence-to-Sequence Models for Vietnamese

June 16, 2022

We present BARTpho in two versions, BARTpho_syllable and BARTpho_word, the first public large-scale monolingual sequence-to-sequence models pre-trained for Vietnamese. BARTpho uses the “large” architecture and the pre-training scheme of the sequence-to-sequence denoising autoencoder BART, making it especially suitable for generative NLP tasks. We conduct experiments comparing BARTpho with its strong multilingual competitor mBART on the downstream task of Vietnamese text summarization and show that, in both automatic and human evaluations, BARTpho outperforms mBART and improves the state-of-the-art. We further evaluate BARTpho and mBART on the Vietnamese capitalization and punctuation restoration tasks, and again find BARTpho more effective than mBART on both. We publicly release BARTpho to facilitate future research and applications in generative Vietnamese NLP.
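BART-style pre-training corrupts the input text and trains the model to reconstruct the original. A minimal sketch of the text-infilling noise used in this scheme (contiguous spans replaced by a single mask token, with span lengths drawn from a Poisson-like distribution as in the BART paper) — the function name, parameters, and example sentence below are illustrative, not taken from BARTpho's released code:

```python
import random

def text_infilling(tokens, mask_token="<mask>", mask_prob=0.2, mean_span=3, rng=None):
    """Corrupt a token list BART-style: replace contiguous spans with one mask token.

    Illustrative sketch only; real implementations sample span lengths from
    Poisson(lambda=3) and mask ~30% of tokens.
    """
    rng = rng or random.Random(0)
    out, i = [], 0
    while i < len(tokens):
        if rng.random() < mask_prob:
            # Draw a span length >= 1; an exponential draw stands in for the
            # Poisson span-length distribution here.
            span = 1 + int(rng.expovariate(1.0 / mean_span))
            out.append(mask_token)  # one mask token covers the whole span
            i += span
        else:
            out.append(tokens[i])
            i += 1
    return out
```

The model is then trained to map the corrupted sequence back to the original, which is why this objective transfers well to generative tasks such as summarization.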


Nguyen Luong Tran, Duong Le, Dat Quoc Nguyen

InterSpeech 2022

