デュック トゥリヌ, ジュリ ティヘリノ
An indirect approach to hand gesture recognition for applications
combining hand gestures and natural language
Abstract: This report introduces an indirect approach to hand gesture recognition for
applications combining hand gestures and natural language. In order to build an
intuitive 3D GUI, we plan to use natural language. Because hand gestures are an
important non-verbal means of augmenting verbal communication, we intend to
integrate a hand gesture interpretation module with the 3D GUI to resolve the inherent
difficulties and ambiguities of processing natural language. This report describes a
study on natural hand gestures useful for manipulating, modelling and translating
3-D virtual objects. From experimental analysis, we divided these gestures into two
categories. One category refers to gestures used to interact with virtual objects directly
in a single mode (e.g., grabbing), while the other category usually combines
information from other modes, such as natural language, and can be useful for a multimodal
interface. Several gestures are usually associated with a single operative concept.
Direct, template-based approaches provide robust recognition for a small set
of gestures, but they are not intuitive to use because gestures usually differ from person to
person, even for a single concept. Therefore, we employed an indirect approach based on
a neural network implementation that determines the position of a focal point generalized
from experimental results. The focal point can be defined as the point on the
hand where the attention of a person viewing the gesture is focused. We found that
observation of this point gives sufficient information for interpreting the intentions of
some gestures.
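
The report's abstract does not give implementation details of the focal-point network. For illustration only, the following is a minimal sketch, assuming flattened 3-D hand-joint coordinates as input and a small feed-forward network that regresses the focal point position; the joint count, network size, and synthetic data are all hypothetical and are not taken from the report.

# Minimal sketch (not the authors' implementation): a small feed-forward
# network that regresses a 3-D "focal point" from flattened hand-joint
# coordinates. All shapes, names, and the synthetic data are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

N_JOINTS = 21          # assumed number of tracked hand joints
IN_DIM = N_JOINTS * 3  # x, y, z per joint
HIDDEN = 32
OUT_DIM = 3            # focal point (x, y, z)

# Randomly initialised two-layer network (tanh hidden layer, linear output).
W1 = rng.normal(0, 0.1, (IN_DIM, HIDDEN))
b1 = np.zeros(HIDDEN)
W2 = rng.normal(0, 0.1, (HIDDEN, OUT_DIM))
b2 = np.zeros(OUT_DIM)

def forward(x):
    """Return hidden activations and predicted focal point coordinates."""
    h = np.tanh(x @ W1 + b1)
    y = h @ W2 + b2
    return h, y

# Synthetic stand-in data: joint positions and labelled focal points.
# In a real system these would come from a hand tracker or data glove and
# from the experimentally observed attention point on the hand.
X = rng.normal(size=(200, IN_DIM))
Y = rng.normal(size=(200, OUT_DIM))

lr = 0.01
for epoch in range(500):
    h, pred = forward(X)
    err = pred - Y                      # mean-squared-error gradient (scaled)
    gW2 = h.T @ err / len(X)
    gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - h**2)      # back-propagate through tanh
    gW1 = X.T @ dh / len(X)
    gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

# Predict the focal point for a new (synthetic) hand pose.
_, focal_point = forward(rng.normal(size=(1, IN_DIM)))
print("estimated focal point:", focal_point[0])

In such a setup, downstream interpretation would track the estimated focal point over time rather than matching the hand shape against stored templates, which is the sense in which the approach is indirect.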