Gesture Spotting Essay

Gesture Spotting and Recognition for Human–Robot Interaction
Hee-Deok Yang, A-Yeon Park, and Seong-Whan Lee, Senior Member, IEEE
IEEE Transactions on Robotics, Vol. 23, No. 2, April 2007, p. 256

Abstract—Visual interpretation of gestures can be useful in accomplishing natural human–robot interaction (HRI). Previous HRI research focused on issues such as hand gestures, sign language, and command gesture recognition. Automatic recognition of whole-body gestures is required in order for HRI to operate naturally. This presents a challenging problem, because describing and modeling meaningful gesture patterns from whole-body gestures is a complex task. This paper presents a new method for recognition of whole-body key gestures in HRI. A human subject is first described by a set of features encoding the angular relationship between a dozen body parts in 3-D. A feature vector is then mapped to a codeword of hidden Markov models. In order to spot key gestures accurately, a sophisticated method of designing a transition gesture model is proposed. To reduce the states of the transition gesture model, model reduction, which merges similar states based on data-dependent statistics and relative entropy, is used. The experimental results demonstrate that the proposed method can be efficient and effective in HRI for automatic recognition of whole-body key gestures from motion sequences.

Index Terms—Gesture spotting, hidden Markov model (HMM), human–robot interaction (HRI), mobile robot, transition gesture model, whole-body gesture recognition.

I. INTRODUCTION

ROBOTICS research is currently supported in a dynamic environment. Traditional robots were used in factories for the purpose of manufacturing, transportation, and so on. Recently, a new generation of “service robots” has begun to emerge [31]. The United Nations (UN), in their recent robotics
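The abstract describes encoding a human subject by angular relationships between body parts in 3-D. The paper does not give its exact feature formula in this excerpt, but one common way such a feature is computed is the angle between two limb direction vectors; the sketch below (joint positions and names are illustrative, not from the paper) shows that calculation:

```python
import math

def angle_between(u, v):
    """Angle in radians between two 3-D vectors, e.g. limb direction vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(a * a for a in v))
    # Clamp to guard against floating-point drift just outside [-1, 1].
    cos_theta = max(-1.0, min(1.0, dot / (nu * nv)))
    return math.acos(cos_theta)

def limb_vector(joint_a, joint_b):
    """Direction vector from one 3-D joint position to another."""
    return tuple(b - a for a, b in zip(joint_a, joint_b))

# Hypothetical joint positions in meters (illustrative only).
shoulder, elbow, wrist = (0.0, 1.4, 0.0), (0.3, 1.1, 0.0), (0.6, 1.4, 0.0)
upper_arm = limb_vector(shoulder, elbow)
forearm = limb_vector(elbow, wrist)
theta = angle_between(upper_arm, forearm)  # one entry of an angular feature vector
```

Repeating this over many body-part pairs per frame yields a feature vector per frame, which is what the paper then quantizes for the HMMs.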
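Mapping a continuous feature vector to a codeword, as the abstract describes, is vector quantization: discrete HMMs consume symbol indices rather than raw vectors. A minimal nearest-centroid sketch, assuming a precomputed Euclidean codebook (the paper's actual codebook design is not given in this excerpt):

```python
def quantize(feature, codebook):
    """Return the index of the nearest codeword (squared Euclidean distance).
    The resulting index is the discrete observation symbol fed to the HMMs."""
    def sqdist(u, v):
        return sum((a - b) ** 2 for a, b in zip(u, v))
    return min(range(len(codebook)), key=lambda i: sqdist(feature, codebook[i]))

# Toy 2-D codebook for illustration; the paper's angular features are higher-dimensional.
codebook = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
frames = [(0.1, 0.1), (0.9, -0.1), (0.2, 0.8)]
symbols = [quantize(f, codebook) for f in frames]  # observation sequence for the HMMs
```

In practice the codebook would be learned from training data (e.g. by k-means) rather than fixed by hand as here.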
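The abstract's model reduction merges similar states of the transition gesture model using data-dependent statistics and relative entropy. As a hedged sketch of that idea (not the paper's exact procedure), the snippet below greedily merges states whose discrete emission distributions are close under symmetric relative entropy, weighting the merged distribution by each state's observation counts:

```python
import math

def kl(p, q, eps=1e-12):
    """Relative entropy D(p||q) between two discrete emission distributions."""
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

def merge_states(emissions, counts, threshold=0.1):
    """Greedily merge states whose symmetric relative entropy is below a
    threshold; the merged emission is a count-weighted average (illustrative
    reduction scheme, not the paper's exact algorithm)."""
    merged, used = [], set()
    for i in range(len(emissions)):
        if i in used:
            continue
        group_p, group_n = list(emissions[i]), counts[i]
        for j in range(i + 1, len(emissions)):
            if j in used:
                continue
            d = 0.5 * (kl(group_p, emissions[j]) + kl(emissions[j], group_p))
            if d < threshold:
                n = counts[j]
                group_p = [(group_n * a + n * b) / (group_n + n)
                           for a, b in zip(group_p, emissions[j])]
                group_n += n
                used.add(j)
        merged.append(group_p)
    return merged
```

Two states with identical emissions collapse into one, while a clearly different state survives, so the transition gesture model ends up with fewer states to evaluate at runtime.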
