Yuejiao Su   Yi Wang   Lap-Pui Chau
The Hong Kong Polytechnic University
Egocentric Interactive Hand-Object Segmentation (EgoIHOS) requires segmenting hands and interacting objects in egocentric images, which is crucial for understanding human behavior in assistive systems. Previous methods typically recognize hands and interacting objects as distinct semantic categories based solely on visual features, or simply use hand predictions as auxiliary cues for object segmentation. Despite promising progress, these methods fail to adequately model the interactive relationships between hands and objects and ignore the coupled physical relationships among object categories, ultimately constraining their segmentation performance. To address these shortcomings, we propose CaRe-Ego, a novel method that achieves state-of-the-art performance by emphasizing the contact between hands and objects from two aspects. First, we introduce a Hand-guided Object Feature Enhancer (HOFE) that establishes hand-object interactive relationships to extract more contact-relevant and discriminative object features. Second, we design a Contact-centric Object Decoupling Strategy (CODS) to explicitly model and disentangle the coupling relationships among object categories, thereby emphasizing contact-aware feature learning. Experiments on various in-domain and out-of-domain test sets show that CaRe-Ego significantly outperforms existing methods with robust generalization capability.
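To give a rough intuition of the hand-guided enhancement idea described above, below is a minimal PyTorch sketch in which hand features guide the refinement of object features via cross-attention. This is an illustrative assumption only, not the released CaRe-Ego implementation; the module name `HandGuidedEnhancer` and all parameters are hypothetical.

```python
# Illustrative sketch (not the official CaRe-Ego code): hand features serve as
# keys/values that guide object-feature refinement via cross-attention,
# producing more contact-relevant object features.
import torch
import torch.nn as nn


class HandGuidedEnhancer(nn.Module):
    """Hypothetical hand-guided object feature enhancer (cross-attention)."""

    def __init__(self, embed_dim: int = 256, num_heads: int = 8):
        super().__init__()
        self.attn = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(embed_dim)

    def forward(self, obj_feat: torch.Tensor, hand_feat: torch.Tensor) -> torch.Tensor:
        # obj_feat:  (B, N_obj, C)  flattened object-branch tokens
        # hand_feat: (B, N_hand, C) flattened hand-branch tokens
        enhanced, _ = self.attn(query=obj_feat, key=hand_feat, value=hand_feat)
        return self.norm(obj_feat + enhanced)  # residual connection + norm


if __name__ == "__main__":
    enhancer = HandGuidedEnhancer()
    obj = torch.randn(2, 1024, 256)   # e.g., a 32x32 object feature map, flattened
    hand = torch.randn(2, 1024, 256)  # corresponding hand feature map, flattened
    print(enhancer(obj, hand).shape)  # torch.Size([2, 1024, 256])
```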
@misc{su2025careegocontactawarerelationshipmodeling,
      title={CaRe-Ego: Contact-aware Relationship Modeling for Egocentric Interactive Hand-object Segmentation},
      author={Yuejiao Su and Yi Wang and Lap-Pui Chau},
      year={2025},
      eprint={2407.05576},
      archivePrefix={arXiv},
      primaryClass={cs.CV},
      url={https://arxiv.org/abs/2407.05576},
}
News: The mini-HOI4D dataset and the checkpoint of the best model have been released.
mini-HOI4D dataset: Google Drive.
Model checkpoint: Google Drive.