Office of Naval Research
Grant #N000141010143
Grant #N000141310438

How to Do Multimodal Detection of Affective States?

IEEE International Conference on Advanced Learning Technologies
Athens, Georgia, USA. July 2011

Considering the human element as crucial in designing and implementing interactive intelligent systems, this tutorial provides a description and hands-on demonstration of the detection of affective states, covering devices, methodologies, and data processing, as well as their impact on instructional design. The information a computer senses in order to automate the detection of affective states spans an extensive set of data, ranging from brain-wave signals and biofeedback readings to face-based or gesture-based emotion recognition and posture or pressure sensing. The work presented in this tutorial is not about developing the algorithms or hardware that make this possible; our concern is the encapsulation of preexisting systems (all of which we actually use) that implement those algorithms and employ that hardware to improve learning.
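The encapsulation idea above can be sketched as wrapping each preexisting detection system behind a common interface and fusing their outputs. This is a minimal illustrative sketch, not the tutorial's actual software: every name, score, and the averaging-based fusion rule here are assumptions made up for the example.

```python
# Illustrative sketch only: hypothetical wrappers around preexisting sensing
# systems, fused into a single affect estimate. All names and values are
# invented for illustration; the real systems expose their own APIs.
from statistics import mean

class SensorChannel:
    """Wraps one preexisting detection system behind a common interface."""
    def __init__(self, name, read_fn):
        self.name = name
        self._read = read_fn

    def read(self):
        # Assume each underlying system reports a score in [0, 1],
        # e.g. the probability that the learner is frustrated.
        return self._read()

def fuse(channels):
    """Naive decision-level fusion: average the per-channel scores."""
    return mean(ch.read() for ch in channels)

# Dummy stand-ins for real sensing systems (EEG, face, posture).
channels = [
    SensorChannel("eeg",     lambda: 0.70),  # brain-wave based detector
    SensorChannel("face",    lambda: 0.60),  # facial expression recognizer
    SensorChannel("posture", lambda: 0.50),  # pressure/posture sensor
]
frustration = fuse(channels)
print(round(frustration, 2))  # 0.6
```

In practice, fusion could happen at the feature level or use a trained classifier instead of a plain average; the point is only that each system stays a black box behind a uniform `read()` interface.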

IEEE Digital Library
Javier Gonzalez-Sanchez, Robert M. Christopherson, Maria Elena Chavez-Echeagaray, David C. Gibson, Robert Atkinson, Winslow Burleson, "How to Do Multimodal Detection of Affective States?," ICALT, pp. 654-655, 2011 IEEE 11th International Conference on Advanced Learning Technologies, 2011.

These are our slides for the tutorial; any comments are more than welcome.