Objects in Action: An Approach for Combining Action Understanding and Object Perception
Title | Objects in Action: An Approach for Combining Action Understanding and Object Perception |
Publication Type | Conference Papers |
Year of Publication | 2007 |
Authors | Gupta A, Davis LS |
Conference Name | 2007 IEEE Conference on Computer Vision and Pattern Recognition (CVPR '07) |
Date Published | 2007/06 |
Keywords | action segmentation; action understanding; Bayes methods; Bayesian approach; gesture recognition; human movements; human perception; human-object interactions; image classification; image motion analysis; image segmentation; inference process; object classification; object localization; object perception; object recognition; object segmentation; video interpretation framework; video signal processing; visual perception |
Abstract | Analysis of videos of human-object interactions involves understanding human movements, locating and recognizing objects, and observing the effects of human movements on those objects. While each of these can be conducted independently, recognition improves when interactions between these elements are considered. Motivated by psychological studies of human perception, we present a Bayesian approach that unifies the inference processes involved in object classification and localization, action understanding, and perception of object reaction. Traditional approaches to object classification and action understanding have relied on shape features and movement analysis, respectively. By placing object classification and localization in a video interpretation framework, we can localize and classify objects that are either hard to localize due to clutter or hard to recognize due to a lack of discriminative features. Similarly, by using the objects on which human movements impinge, and the effects of those movements, as context for the movements themselves, we can segment and recognize actions that are either too subtle to perceive or too hard to recognize from motion features alone. |
DOI | 10.1109/CVPR.2007.383331 |
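The abstract describes joint Bayesian inference over object identity and action, coupled so that each hypothesis supplies context for the other. The following is a minimal toy sketch of that general idea, not the paper's actual model: the object classes, likelihood values, and the compatibility prior P(action | object) below are all invented for illustration. It shows how a compatibility prior lets weak appearance and motion evidence reinforce each other, where independent classifiers would have to decide from each cue alone.

# Illustrative sketch only: toy joint inference over (object, action) pairs
# in the spirit of the paper's Bayesian unification. All numbers and class
# names are hypothetical, not taken from the paper.

OBJECTS = ["cup", "spray_bottle"]
ACTIONS = ["drinking", "spraying"]

# P(appearance evidence | object): the two objects look similar, so
# appearance alone is nearly uninformative (hypothetical values).
p_obj_evidence = {"cup": 0.55, "spray_bottle": 0.45}

# P(motion evidence | action): the observed reach motion only weakly
# favors "drinking" (hypothetical values).
p_act_evidence = {"drinking": 0.60, "spraying": 0.40}

# Compatibility prior P(action | object): drinking goes with cups,
# spraying with spray bottles. This coupling is what independent
# object and action classifiers lack.
p_act_given_obj = {
    ("drinking", "cup"): 0.9, ("spraying", "cup"): 0.1,
    ("drinking", "spray_bottle"): 0.1, ("spraying", "spray_bottle"): 0.9,
}

def joint_map(p_obj_prior=0.5):
    """MAP (object, action) under P(o, a | e) proportional to
    P(e_o | o) * P(e_a | a) * P(a | o) * P(o), uniform object prior."""
    scores = {
        (o, a): p_obj_evidence[o] * p_act_evidence[a]
                * p_act_given_obj[(a, o)] * p_obj_prior
        for o in OBJECTS for a in ACTIONS
    }
    total = sum(scores.values())
    posterior = {pair: s / total for pair, s in scores.items()}
    return max(posterior, key=posterior.get), posterior

if __name__ == "__main__":
    (obj, act), posterior = joint_map()
    print("joint MAP:", obj, act)
    for (o, a), p in sorted(posterior.items(), key=lambda kv: -kv[1]):
        print(f"P({o}, {a} | evidence) = {p:.3f}")

Running the sketch, the compatible pair (cup, drinking) gets posterior 0.584 while the incompatible pairs are suppressed, even though neither the appearance nor the motion likelihood is decisive on its own; this is the qualitative effect the abstract claims for joint inference over cluttered scenes and subtle actions.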