From Disfluency Detection to Intent Detection and Slot Filling

June 16, 2022

We present the first empirical study investigating the influence of disfluency detection on the downstream tasks of intent detection and slot filling. We perform this study for Vietnamese, a low-resource language with no prior study and no public dataset available for disfluency detection. First, we extend PhoATIS, a fluent Vietnamese intent detection and slot filling dataset, by manually adding contextual disfluencies and annotating them. Then, we conduct experiments using strong baselines for disfluency detection and joint intent detection and slot filling, which are based on pre-trained language models. We find that: (i) disfluencies degrade the performance of the downstream intent detection and slot filling tasks, and (ii) in the disfluent context, the pre-trained multilingual language model XLM-R yields better intent detection and slot filling performance than the pre-trained monolingual language model PhoBERT, which is the opposite of what is generally found in the fluent context.
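The pipeline studied here, detecting disfluencies and removing them before joint intent detection and slot filling, can be sketched as below. This is a minimal, hypothetical illustration: real systems tag tokens with a fine-tuned pre-trained language model (e.g. XLM-R or PhoBERT), whereas here a toy vocabulary-based tagger stands in so the data flow is runnable end to end.

```python
# Hedged sketch of a disfluency-removal preprocessing step.
# The tagger below is a toy stand-in; in the paper's setting, a
# fine-tuned language model would produce the token labels.

def tag_disfluencies(tokens, disfluent_vocab):
    """Label each token: 'D' if disfluent, 'F' if fluent."""
    return ["D" if tok.lower() in disfluent_vocab else "F" for tok in tokens]

def remove_disfluencies(tokens, labels):
    """Keep only fluent tokens; the cleaned utterance is what a joint
    intent detection / slot filling model would consume downstream."""
    return [tok for tok, lab in zip(tokens, labels) if lab == "F"]

if __name__ == "__main__":
    utterance = "show me uh flights um to Hanoi".split()
    filler_words = {"uh", "um"}  # hypothetical disfluency vocabulary
    labels = tag_disfluencies(utterance, filler_words)
    cleaned = remove_disfluencies(utterance, labels)
    print(" ".join(cleaned))  # show me flights to Hanoi
```

The point of the study is that errors or omissions in this first stage propagate to the intent and slot predictions, which is why disfluencies hurt downstream performance.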


Mai Hoang Dao, Thinh Hung Truong, Dat Quoc Nguyen

InterSpeech 2022

