Office of Naval Research
Grant #N000141010143
Grant #N000141310438

Course:
Multimodal Detection of Affective States: A Roadmap Through Diverse Technologies

This course presents devices and explores methodologies for multimodal detection of affective states, and discusses the presenters' experiences using them in both learning and gaming scenarios.
 
Video Preview
 
Abstract
One important way for systems to adapt to their individual users is through their ability to show empathy. Being empathetic implies that the computer can recognize a user's affective states and understand the implications of those states. Detecting affective states is a step toward providing machines with the intelligence necessary to interact appropriately with humans. This course provides a description and demonstration of tools and methodologies for automatically detecting affective states with a multimodal approach.
 

Objectives

  1. Describe the sensing devices used to detect affective states, including brain-computer interfaces, face-based emotion recognition systems, eye-tracking systems, and physiological sensors.
  2. Compare the pros and cons of the sensing devices used to detect affective states.
  3. Describe the data gathered from each sensing device and its characteristics.
  4. Examine what it takes to gather, filter, and integrate affective data.
  5. Present approaches and algorithms used to analyze affective data, and show how it can drive computer functionality or behavior (see the sketch after this list).
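
The gather-filter-integrate pipeline of objectives 4 and 5 can be made concrete with a short sketch. The Python fragment below is a minimal illustration, not code from the course materials: the Reading type, the channel names, the moving-average filter, and the averaging fusion are all simplifying assumptions standing in for the real devices and algorithms the course covers.

    # Minimal sketch of a multimodal affect pipeline (hypothetical, for
    # illustration only): gather readings from several sensor channels,
    # smooth each channel, and fuse them into one scalar affect estimate.

    from dataclasses import dataclass
    from statistics import mean

    @dataclass
    class Reading:
        channel: str      # assumed channel names, e.g. "eeg", "face", "eye"
        timestamp: float  # seconds since session start
        value: float      # normalized arousal cue in [0, 1]

    def moving_average(values, window=3):
        """Simple smoothing filter to damp per-sensor noise."""
        return [mean(values[max(0, i - window + 1):i + 1])
                for i in range(len(values))]

    def fuse(readings):
        """Feature-level fusion: average the latest smoothed cue per channel."""
        by_channel = {}
        for r in sorted(readings, key=lambda r: r.timestamp):
            by_channel.setdefault(r.channel, []).append(r.value)
        latest = [moving_average(vs)[-1] for vs in by_channel.values()]
        return mean(latest)  # crude scalar arousal estimate in [0, 1]

    if __name__ == "__main__":
        sample = [
            Reading("eeg", 0.0, 0.62), Reading("eeg", 0.5, 0.70),
            Reading("face", 0.1, 0.40), Reading("face", 0.6, 0.55),
            Reading("skin_conductance", 0.2, 0.80),
        ]
        arousal = fuse(sample)
        # A system could adapt its behavior when the estimate crosses a threshold.
        print(f"estimated arousal: {arousal:.2f}",
              "-> adapt" if arousal > 0.6 else "-> steady")

A real system would replace the averaging steps with trained per-modality classifiers and a principled fusion model; the sketch is only meant to show the shape of gathering, filtering, and integrating affective data before it drives system behavior.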

 
This course is open to researchers, practitioners, and educators interested in incorporating the detection of affective states into their technology toolbox.
 

A Sneak Peek at the Slides

 
Reference
Gonzalez-Sanchez, J., Chavez-Echeagaray, M.E., Atkinson, R., and Burleson, W. (2014). Multimodal Detection of Affective States: A Roadmap Through Diverse Technologies. In Extended Abstracts Proceedings of the 2014 ACM SIGCHI Conference on Human Factors in Computing Systems (CHI). Toronto, ON, Canada. May 2014. ACM. pp. 1-2. doi:10.1145/2559206.2567820