TR-H-0203 : 1996.11.13

Eric VATIKIOTIS-BATESON, Inge-Marie EIGSTI, Sumio YANO and Kevin MUNHALL

Perceiver Eye Motion during Audiovisual Speech Perception

Abstract: The eye movements of subjects were recorded during audiovisual presentations of extended monologues. Monologues were presented at different image sizes and with different levels of acoustic masking noise. Two clear targets of gaze fixation were identified: the eyes and the mouth. Regardless of image size, perceivers of both Japanese and English gazed more at the mouth as masking noise levels increased. However, even at the highest noise levels and largest image sizes, subjects gazed at the mouth only about half the time. For the eye target, perceivers typically gazed at one eye more than the other, and this tendency became stronger at higher noise levels. In the analysis of gaze fixation sequences, e.g., left eye to mouth to left eye to right eye, English perceivers displayed a greater variety of gaze sequence patterns and persisted in using them at higher noise levels than did Japanese perceivers. No segment-level correlations were found between perceiver eye motions and the phoneme identity of the stimuli.