11th Annual Conference of the International Speech Communication Association

Makuhari, Chiba, Japan
September 26-30, 2010

A New Multichannel Multi Modal Dyadic Interaction Database

Viktor Rozgić, Bo Xiao, Athanasios Katsamanis, Brian R. Baucom, Panayiotis G. Georgiou, Shrikanth S. Narayanan

University of Southern California, USA

In this work we present a new multi-modal database for the analysis of participant behaviors in dyadic interactions. The database contains multiple channels: close- and far-field audio, a high-definition camera array, and motion capture data. The motion capture data allow precise analysis of low-level body-language descriptors and their comparison with similar descriptors derived from the video. The data are manually labeled by multiple human annotators using psychology-informed guidelines. This work also presents an initial analysis of approach-avoidance (A-A) behavior. Two sets of annotations are provided: one based on video only, and the other obtained using both the audio and video channels. Additionally, we describe the statistics of the interaction descriptors and the dependence of the A-A labels on participants' roles. Finally, we provide an analysis of the relations between various non-verbal features and the A-A labels.
Bibliographic reference.  Rozgić, Viktor / Xiao, Bo / Katsamanis, Athanasios / Baucom, Brian R. / Georgiou, Panayiotis G. / Narayanan, Shrikanth S. (2010): "A new multichannel multi modal dyadic interaction database", In INTERSPEECH-2010, 1982-1985.