Title | Partially Occluded Hands: A challenging new dataset for single-image hand pose estimation |
Publication Type | Conference Paper |
Year of Publication | 2018 |
Authors | Myanganbayar, B., Mata, C., Dekel, G., Katz, B., Ben-Yosef, G., Barbu, A. |
Conference Name | The 14th Asian Conference on Computer Vision (ACCV 2018) |
Date Published | 12/2018 |
Keywords | dataset, partial occlusion, RGB hand-pose reconstruction |
Abstract | Recognizing the pose of hands matters most when hands are interacting with other objects. To understand how well both machines and humans perform on single-image 2D hand-pose reconstruction from RGB images, we collected a challenging dataset of hands interacting with 148 objects. We used a novel methodology that provides the same hand in the same pose both with the object present and occluding the hand, and without the object occluding the hand. Additionally, we collected a wide range of grasps for each object, designing the data collection methodology to ensure this diversity. Using this dataset, we measured the performance of two state-of-the-art hand-pose recognition methods, showing that both are extremely brittle when faced with even light occlusion from an object. This is not evident in previous datasets because they often avoid hand-object occlusions and because they are collected from videos where hands are often between interactions with objects and mostly unoccluded. We annotated a subset of the dataset and used it to show that humans are robust with respect to occlusion, and also to characterize human hand perception, the space of grasps that seem to be considered, and the accuracy of reconstructing occluded portions of hands. We expect that such data will be of interest both to the vision community, for developing more robust hand-pose algorithms, and to the robotic grasp-planning community, for learning such grasps. The dataset is available at occludedhands.com. |
URL | http://accv2018.net/ |
CBMM Relationship:
- CBMM Funded