TR-C-0059 : 1991.1.30

グナセゲラン ムルヴァパン,竹村治雄,岸野文郎

HAND MOTION INTERPRETATION USING NEURAL NETWORKS

Abstract: This report describes a method for interpreting hand gestures; one possible application is the manipulation of graphical objects on a computer screen. The devices used to detect hand and finger positions are the DataGlove and the 3SPACE Isotrak. The method is based on neural networks. We consider only translation, rotation, and scaling operations, and we assume that only the fingers are used to scale objects. Data provided by the tracker (coordinates and angles) are used to compute translation and rotation. Data provided by the glove serve as inputs to a network that computes a parameter defining the scaling operation (enlargement or shrinking). A simple neural network turned out to be quite limited in recognizing some complex gestures, such as a continuous growth gesture. We therefore added recurrent connections to give the network memory and some predictive power. Experimental results show that recurrent networks give the best responses.
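
The following sketch (Python with NumPy, not part of the original report) illustrates the kind of recurrent mapping described above: each DataGlove frame of finger-joint angles is combined with the previous hidden state, and the network emits one scaling parameter per time step. The input width of ten joint angles (two per finger), the hidden-layer size, and the sign convention for enlargement versus shrinking are illustrative assumptions.

    # Minimal sketch of an Elman-style recurrent network mapping a
    # sequence of DataGlove finger-joint angles to a scaling parameter.
    # Sizes and sign convention are assumptions, not the report's values.
    import numpy as np

    rng = np.random.default_rng(0)
    N_IN, N_HID, N_OUT = 10, 16, 1      # assumed: 10 joint angles (2 per finger)

    # Weights: input -> hidden, previous hidden (context) -> hidden, hidden -> output
    W_in  = rng.normal(0.0, 0.1, (N_HID, N_IN))
    W_ctx = rng.normal(0.0, 0.1, (N_HID, N_HID))
    W_out = rng.normal(0.0, 0.1, (N_OUT, N_HID))

    def step(x, h_prev):
        """One time step: fuse the current glove frame with the previous
        hidden state (the recurrent 'memory') and emit a scaling value."""
        h = np.tanh(W_in @ x + W_ctx @ h_prev)
        y = np.tanh(W_out @ h)          # assumed: > 0 enlarge, < 0 shrink
        return y, h

    # Process a sequence of glove frames (e.g. a continuous growth gesture).
    h = np.zeros(N_HID)
    for x in rng.uniform(0.0, 1.0, (30, N_IN)):   # 30 dummy frames
        y, h = step(x, h)
    print("scaling parameter for last frame:", float(y))

Training (for example by backpropagation through time) is omitted; the sketch only shows how recurrent connections give the network a short-term memory of the gesture's recent frames, which is what allows a continuous growth gesture to be followed over time.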