Jacob Eisenstein

Giving Discourse Processing a Hand: Gesture cues for Discourse Structure

 

Computers cannot fully understand spoken language without access to the wide range of modalities that accompany speech. My research addresses the particularly expressive modality of hand gesture, and focuses on building structured statistical models at the intersection of speech, vision, and meaning. While individual gestures may be idiosyncratic and thus difficult to interpret, we can still leverage gesture to improve language understanding by identifying patterns of similarity between gestures, and hypothesizing similar patterns in the meaning of the associated speech. I will describe successful applications of this idea to resolving ambiguous noun phrases, improving topic segmentation, and creating keyframe summaries of spoken language.


Official inquiries about AIIS should be directed to Alexandre Klementiev (klementi AT uiuc DOT edu)
Last update: 01/22/2008