
MOST: Multi-Source Domain Adaptation via Optimal Transport for Student-Teacher Learning

May 23, 2021

Multi-source domain adaptation (DA) is more challenging than conventional DA because knowledge is transferred from several source domains to a target domain. To this end, we propose in this paper a novel model for multi-source DA using the theory of optimal transport and imitation learning. More specifically, our approach consists of two cooperative agents: a teacher classifier and a student classifier. The teacher classifier is a combined expert that leverages the knowledge of the domain experts and can be theoretically guaranteed to handle source examples perfectly, while the student classifier acting on the target domain tries to imitate the teacher classifier acting on the source domains. Our rigorous theory, developed based on optimal transport, makes this cross-domain imitation possible and also helps to mitigate not only the data shift but also the label shift, which are inherently thorny issues in DA research. We conduct comprehensive experiments on real-world datasets to demonstrate the merit of our approach and its optimal transport based imitation learning viewpoint. Experimental results show that, to the best of our knowledge, our proposed method achieves state-of-the-art performance on benchmark datasets for multi-source domain adaptation, including Digits-five, Office-Caltech10, and Office-31.
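The student-teacher coupling described above can be pictured with a minimal sketch (not the authors' implementation): per-domain expert classifiers form the teacher, an entropic optimal transport plan couples target points to the pooled source points, and the transported teacher predictions become soft labels for the student to imitate. The use of the POT library, the toy data, and all variable names below are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of OT-based student-teacher imitation for multi-source DA.
# Assumes the POT library (pip install pot); everything here is a toy example.
import numpy as np
import ot  # Python Optimal Transport

rng = np.random.default_rng(0)
n_src, n_tgt, d, n_classes, n_domains = 60, 40, 5, 3, 2

# Toy features: two source domains and one shifted target domain.
src_feats = [rng.normal(k, 1.0, size=(n_src, d)) for k in range(n_domains)]
tgt_feats = rng.normal(0.5, 1.0, size=(n_tgt, d))

# Stand-ins for trained domain experts: one linear classifier per source domain.
expert_weights = [rng.normal(size=(d, n_classes)) for _ in range(n_domains)]

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Teacher (combined expert): each source point is scored by its own domain expert.
teacher_probs = np.concatenate(
    [softmax(src_feats[k] @ expert_weights[k]) for k in range(n_domains)], axis=0
)
all_src = np.concatenate(src_feats, axis=0)

# Entropic OT plan coupling target points to the pooled source points.
a = np.full(n_tgt, 1.0 / n_tgt)                 # uniform mass on target
b = np.full(len(all_src), 1.0 / len(all_src))   # uniform mass on sources
M = ot.dist(tgt_feats, all_src, metric="sqeuclidean")
M /= M.max()
plan = ot.sinkhorn(a, b, M, reg=0.05)           # shape (n_tgt, n_src_total)

# Barycentric mapping of teacher predictions onto the target: these soft labels
# are what the student classifier on the target domain would be trained to imitate.
imitation_targets = (plan @ teacher_probs) / plan.sum(axis=1, keepdims=True)
print(imitation_targets.shape)  # (n_tgt, n_classes)
```

In this sketch the transport plan plays the role of the cross-domain bridge: it decides how much each source expert's prediction should influence each target point, which is the intuition behind letting the student imitate the teacher across domains.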


Tuan Nguyen, Trung Le, He Zhao, Quan Hung Tran, Truyen Nguyen, Dinh Phung

UAI 2021

