One of the most desirable characteristics of modern software systems is self-adaptation: the capability to monitor, understand, and respond to changes in the computational environment. A particular case is affect-driven self-adaptation, which uses awareness of the user's affects (emotions) to drive adaptations that respond to changes in those affects.
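The monitor–understand–respond cycle can be sketched as a minimal loop. This is an illustrative example only, not the project's implementation; the names `AffectState`, `infer_affect`, and `adapt`, and the valence/arousal thresholds, are assumptions introduced here for clarity.

```python
from dataclasses import dataclass

# Hypothetical sketch of an affect-driven self-adaptation step:
# understand (infer the user's affect) and respond (pick an adaptation).
# All names and thresholds below are illustrative assumptions.

@dataclass
class AffectState:
    valence: float  # negative (-1.0) to positive (+1.0)
    arousal: float  # calm (0.0) to excited (1.0)

def infer_affect(raw: dict) -> AffectState:
    # Placeholder "understand" step: a real system would run trained
    # recognition models over the monitored sensor data.
    return AffectState(valence=raw.get("valence", 0.0),
                       arousal=raw.get("arousal", 0.0))

def adapt(state: AffectState) -> str:
    # Placeholder "respond" step: map the inferred affect to an action.
    if state.valence < -0.3 and state.arousal > 0.6:
        return "offer_help"            # likely frustration
    if state.arousal < 0.2:
        return "increase_engagement"   # likely boredom
    return "continue"

# One iteration of the loop on monitored readings:
decision = adapt(infer_affect({"valence": -0.5, "arousal": 0.8}))
print(decision)  # -> offer_help
```

A real loop would run continuously, feeding each new batch of sensor readings through these two steps.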
Affective Companions are virtual characters used in research and industry to make information and complex tasks more accessible to customers and users. With the rise of new technologies for affect (emotion) recognition, companions are no longer purely task-driven but also focus on the social aspects of conversation. Affective Companions aim to enhance the social quality of life of their users and to build deep relationships with them.
The accomplishment of this project requires, on the one hand, gathering affect information using sensing technologies such as eye tracking, facial and gesture recognition, brain-computer interfaces, and motion capture; on the other hand, it requires creating an intelligent virtual character able to show a high level of understanding of changes in the user's affective state.
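Combining several sensing channels into a single affect estimate could be done, for example, with a confidence-weighted fusion over whichever channels are currently available. This is a sketch under assumptions: the channel names, weights, and the `fuse` function are hypothetical, not part of the project.

```python
# Hypothetical fusion of per-channel affect estimates (e.g. eye tracking,
# facial recognition, BCI, motion capture) into one score in [0, 1].
# Channel names and weights are illustrative assumptions.

def fuse(estimates: dict, weights: dict) -> float:
    # Confidence-weighted average over the channels that produced an
    # estimate; channels without a reading are simply skipped.
    available = [k for k in estimates if k in weights]
    total = sum(weights[k] for k in available)
    if total == 0:
        return 0.0
    return sum(estimates[k] * weights[k] for k in available) / total

weights = {"eye": 0.2, "face": 0.4, "bci": 0.3, "motion": 0.1}

# Only two channels reporting this frame:
score = fuse({"face": 0.8, "bci": 0.6}, weights)
print(round(score, 3))  # -> 0.714  ((0.8*0.4 + 0.6*0.3) / 0.7)
```

Skipping missing channels keeps the estimate usable when a sensor drops out, which is common with wearable and camera-based setups.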
The proposed Affective Companion is envisioned to work inside a learning environment; it therefore focuses on building a social relationship with students and helping them improve their performance. Figure 1 shows the conceptual architecture of the envisioned system.
The following video shows the current state of this project: