CV
AAAI

Count What You Want: Exemplar Identification and Few-shot Counting of Human Actions in the Wild

January 8, 2024

This paper addresses the task of counting human actions of interest using sensor data from wearable devices. We propose a novel exemplar-based framework that allows users to provide exemplars of the actions they want to count by vocalizing the predefined words “one”, “two”, and “three”. Our method first localizes the temporal positions of these utterances in the audio sequence. These positions serve as the basis for identifying exemplars representing the action class of interest. A similarity map is then computed between the exemplars and the entire sensor data sequence and fed into a density estimation module, which produces a sequence of estimated density values. Summing these density values yields the final count. To develop and evaluate our approach, we introduce a diverse and realistic dataset consisting of real-world data from 37 subjects and 50 action categories, encompassing both sensor and audio data. Experiments on this dataset demonstrate the viability of the proposed method in counting instances of actions from new classes and subjects that were not part of the training data. On average, the discrepancy between the predicted count and the ground truth is 7.47, significantly lower than the errors of frequency-based and transformer-based methods. Our project, code, and dataset can be found at https://github.com/cvlab-stonybrook/ExRAC.
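To make the pipeline in the abstract concrete, below is a minimal PyTorch sketch of the exemplar-matching and density-summation idea. It is not the authors' implementation: the `ExemplarCounter` class, the max-over-exemplars aggregation, and the small convolutional density head are illustrative assumptions, and feature extraction from raw sensor/audio data is omitted.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class ExemplarCounter(nn.Module):
    """Sketch: correlate exemplar features with the full sensor sequence,
    regress a per-timestep density, and sum it to obtain the count."""

    def __init__(self):
        super().__init__()
        # Hypothetical density head: maps the similarity signal to densities.
        self.density_head = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.Conv1d(16, 1, kernel_size=5, padding=2),
            nn.ReLU(),  # densities are non-negative
        )

    def forward(self, seq_feats, exemplar_feats):
        # seq_feats: (T, D) features of the whole sensor sequence
        # exemplar_feats: (K, D) features of the K identified exemplars
        seq = F.normalize(seq_feats, dim=-1)
        ex = F.normalize(exemplar_feats, dim=-1)
        # Similarity map between every timestep and every exemplar: (T, K)
        sim = seq @ ex.t()
        # Aggregate over exemplars into one similarity channel per timestep
        sim = sim.max(dim=-1).values.view(1, 1, -1)   # (1, 1, T)
        density = self.density_head(sim).squeeze()    # (T,)
        count = density.sum()                         # predicted action count
        return density, count


# Usage with random stand-in features:
model = ExemplarCounter()
density, count = model(torch.randn(500, 64), torch.randn(3, 64))
```

Because counting is driven by similarity to user-provided exemplars rather than by class-specific weights, a model of this form can, in principle, count instances of action classes never seen during training, which is the few-shot setting the paper targets.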


Yifeng Huang*, Duc Nguyen Duy*, Lam Nguyen Thanh, Cuong Pham, Minh Hoai

