To participate in embodied interaction with humans, social robots must be able to recognize relevant social patterns, including interaction rhythms, imitation, and particular sequences of behaviors, and to relate them to socially meaningful interaction schemas. In this project we measure and quantify these patterns by observing and recording interaction between humans performing shadow puppetry. Shadow puppetry provides a medium of interaction that is expressive enough to exhibit the phenomena we are interested in, yet constrained enough that capturing and modeling the behavior of the participants remains tractable.
First, we record embodied interaction between human participants.
Next, we extract from each video sequence a stream of behavior primitives that represent the basic tokens of shadow puppetry. We treat the behavior of each participant at any instant as a random variable and build distributions that model the generative process of their interaction. We have built a behavior recognition system that converts the video stream into gestural tokens in real time.
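As a rough illustration of this modeling step, the sketch below estimates a joint distribution over co-occurring gestural tokens from two aligned token streams. The token names, the smoothing constant, and the example streams are illustrative assumptions, not the vocabulary actually learned from the recordings.

```python
from collections import Counter
from itertools import product

# Illustrative gestural vocabulary; the real token set is extracted from the
# shadow-puppetry recordings, so these names are assumptions.
TOKENS = ["peck", "flap", "rear", "hold"]

def joint_distribution(stream_a, stream_b, alpha=1.0):
    """Estimate P(a, b) over simultaneous tokens from two aligned streams,
    with additive (Laplace) smoothing so unseen pairs keep nonzero mass."""
    counts = Counter(zip(stream_a, stream_b))
    total = len(stream_a) + alpha * len(TOKENS) ** 2
    return {
        (a, b): (counts[(a, b)] + alpha) / total
        for a, b in product(TOKENS, TOKENS)
    }

# Two short, hand-made token streams standing in for the recognizer's output.
human_a = ["peck", "peck", "flap", "hold", "rear"]
human_b = ["peck", "flap", "flap", "hold", "hold"]
p_joint = joint_distribution(human_a, human_b)
```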
The perception system of our robot encodes the gestural tokens of a human partner in real time.
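The recognizer itself is not specified here; as a stand-in, the sketch below classifies a per-frame feature vector by nearest centroid. Both the feature space and the centroid values are assumptions, where a trained system would fit them to labeled puppetry footage.

```python
import numpy as np

# Hypothetical per-token centroids in some silhouette feature space; in a
# real system these would be learned from labeled training footage.
CENTROIDS = {
    "peck": np.array([0.9, 0.1, 0.2]),
    "flap": np.array([0.2, 0.8, 0.3]),
    "rear": np.array([0.1, 0.3, 0.9]),
    "hold": np.array([0.4, 0.4, 0.4]),
}

def encode_frame(features):
    """Map one frame's feature vector to the nearest gestural token."""
    return min(CENTROIDS, key=lambda t: np.linalg.norm(features - CENTROIDS[t]))
```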
The robot observes the behavior of the human and generates interactive behavior by sampling from the learned joint distribution.
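A minimal sketch of this generation step, assuming the joint distribution estimated above: condition on the token just observed from the human and sample the robot's response from the resulting conditional. The example distribution values are placeholders.

```python
import random

# Tiny illustrative joint distribution over (human, robot) token pairs; in
# practice this comes from an estimate like the one built in the earlier sketch.
P_JOINT = {
    ("flap", "flap"): 0.20, ("flap", "peck"): 0.10,
    ("flap", "hold"): 0.05, ("flap", "rear"): 0.05,
    ("peck", "peck"): 0.25, ("peck", "hold"): 0.15,
    ("hold", "hold"): 0.10, ("hold", "flap"): 0.10,
}

def sample_response(p_joint, observed_token):
    """Sample the robot's token from P(robot | human = observed_token);
    random.choices renormalizes weights, so they need not sum to 1."""
    pairs = [(b, p) for (a, b), p in p_joint.items() if a == observed_token]
    tokens, weights = zip(*pairs)
    return random.choices(tokens, weights=weights, k=1)[0]

# e.g. choose an interactive response to an observed "flap"
print(sample_response(P_JOINT, "flap"))
```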
RPI Computer Science & STS Departments © 2008