A New Multi-modal Dataset for Human Affect Analysis

Authors: 

Haolin Wei, David Monaghan, Noel O'Connor, Patricia Scanlon

Publication Type: 
Refereed Conference Meeting Proceeding
Abstract: 
In this paper we present a new multi-modal dataset of spontaneous three-way human interactions. Participants were recorded in an unconstrained environment at various locations during a sequence of debates in a video-conference, Skype-style arrangement. An additional depth modality was introduced, permitting the capture of 3D information alongside the video and audio signals. The dataset consists of 16 participants and is subdivided into 6 unique sections. It was manually annotated on a continuous scale across 5 affective dimensions: arousal, valence, agreement, content and interest. Annotation was performed by three human annotators, with the ensemble average calculated for use in the dataset. The corpus enables the analysis of human affect during conversations in a real-life scenario. We first briefly review existing affect datasets and the methodologies related to affect dataset construction, then detail how our unique dataset was constructed.
Conference Name: 
HBU 2014
Proceedings: 
5th International Workshop, HBU 2014
Digital Object Identifier (DOI): 
10.1007/978-3-319-11839-0_4
Publication Date: 
12/09/2014
Volume: 
8749
Issue: 
Human Behavior Understanding Lecture Notes in Computer Science
Pages: 
42-51
Conference Location: 
Switzerland
Institution: 
Dublin City University (DCU)
Open access repository: 
Yes