Office of Naval Research
Grant #N000141010143
Grant #N000141310438

Presentation
Video Game Preferences and Personality in Undergraduate Students

On August 30, 2011, John M. Quick presented the results of his recent research on video games and personality at the EARLI 2011 conference in Exeter, UK. The presentation provided an overview of the game design and player type framework that originated from a fall 2010 survey of nearly 300 undergraduate students. The slides from the presentation are available for download as a PDF file. The framework comprises six game design features that influence student enjoyment of video games and six player types identified in the undergraduate student population. Please contact John [dot] M [dot] Quick [at] asu [dot] edu for more information on this research.

About
Intelligent Tutoring Project

In the summer of 2010, the ANGLE Lab partnered with Mountain Pointe High School in the Tempe School District to design and administer a summer school Algebra program using math-based intelligent tutoring systems. The partnership allowed the ANGLE Lab, on behalf of ONR, to evaluate the effectiveness of three off-the-shelf math-based intelligent tutoring systems, and allowed Mountain Pointe High School to upgrade its computer-based summer school Algebra course. The partnership continued in Fall 2010 with a curriculum integration project in which the intelligent tutors were evaluated as practice tools within a remedial high school math course as well as in an after-school AIMS remediation program.

In Summer 2011, the ANGLE Lab partnered with the Mesa School District to design and administer another summer school remediation program for Algebra students. In Fall 2012, the partnership with the Mesa School District continued: the ALEKS intelligent tutor, selected on the basis of usability factors, was designed into the regular school-year Algebra remediation curriculum and is being evaluated in conjunction with self-regulated learning instruction that teaches students how to practice most effectively with the tutoring systems.

All four of these collaborative field studies were designed to evaluate off-the-shelf intelligent tutoring systems while giving high school students the opportunity to remediate their Algebra skills. In each study, the ANGLE Lab provided the tutoring software and two researchers who also served as technical support staff, allowing the high school math teachers to focus on assisting students rather than troubleshooting the software.

Paper
Evaluating Adaptive, Computer-Based Mathematics Tutoring Systems: A Math Improvement and Feasibility Study

Abstract
This study evaluated two off-the-shelf, adaptive, computer-based mathematics tutoring systems that teach algebra and other foundational math skills. Thirty high school algebra students who failed to pass algebra in the previous semester were randomly assigned in equal proportions to work with Carnegie Learning’s Cognitive Tutor or the ALEKS algebra course product. Using the tutoring system exclusively, the students completed a four-hour-a-day, 14-day summer school high school algebra class for credit. The results revealed that both tutoring systems produced very large learning gains, on the order of a two-sigma effect, on measures of arithmetic and algebra knowledge.
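For readers unfamiliar with the term, a “two-sigma” gain refers to a standardized learning gain of roughly two standard deviations, in the spirit of Bloom’s two-sigma finding. As an illustration only (the exact effect-size statistic reported in the paper is not restated here), a standardized pre/post gain can be written as:

% Illustrative standardized gain (Cohen's d); d of about 2 corresponds to a two-sigma effect.
d = \frac{\bar{X}_{\mathrm{post}} - \bar{X}_{\mathrm{pre}}}{s_{\mathrm{pooled}}},
\qquad
s_{\mathrm{pooled}} = \sqrt{\tfrac{1}{2}\left(s_{\mathrm{pre}}^{2} + s_{\mathrm{post}}^{2}\right)}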

AERA 2011 Evaluating Adaptive Mathematics Tutoring Systems

Paper
Affective Computing Meets Design Patterns: A Pattern-Based Model of A Multimodal Emotion Recognition Framework

Gonzalez-Sanchez, J., Chavez-Echeagaray, M.E., Atkinson, R., and Burleson, W., “Affective Computing Meets Design Patterns: A Pattern-Based Model of a Multimodal Emotion Recognition Framework,” Proceedings of the 16th European Conference on Pattern Languages of Programs (July 2011). In press.

Abstract
The computer’s ability to recognize human affective states from physiological signals is increasingly being used to create empathetic systems such as learning environments, health care systems, and videogames. Despite that, there are few frameworks, libraries, architectures, or software tools that allow system developers to easily integrate emotion recognition into their software projects, and most of them are monomodal. The work reported here offers a step toward filling this gap by using software patterns to model a pattern-based approach to multimodal emotion recognition.
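For readers curious about what a pattern-based multimodal design can look like in code, here is a small illustrative sketch. It is not the paper’s actual model; it simply uses the Observer pattern to decouple modality-specific recognizers from the systems that consume their output, and every class and value below is hypothetical.

import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch: Observer-style wiring of modality-specific recognizers,
// so client systems subscribe once and stay decoupled from individual sensing
// channels. Names and values are illustrative only, not the paper's classes.
interface EmotionListener {
    void onEmotion(String modality, String emotion, double confidence);
}

abstract class ModalityRecognizer {
    private final List<EmotionListener> listeners = new ArrayList<>();
    void addListener(EmotionListener l) { listeners.add(l); }
    protected void publish(String emotion, double confidence) {
        for (EmotionListener l : listeners) {
            l.onEmotion(name(), emotion, confidence);
        }
    }
    abstract String name();
    abstract void poll();   // read the underlying device/SDK and publish
}

class FaceRecognizer extends ModalityRecognizer {
    String name() { return "face"; }
    void poll() { publish("frustration", 0.62); } // stand-in for a real SDK call
}

class EegRecognizer extends ModalityRecognizer {
    String name() { return "eeg"; }
    void poll() { publish("engagement", 0.71); }  // stand-in for a real SDK call
}

public class PatternSketch {
    public static void main(String[] args) {
        EmotionListener logger = (m, e, c) ->
                System.out.printf("%s -> %s (%.2f)%n", m, e, c);
        List<ModalityRecognizer> recognizers =
                List.of(new FaceRecognizer(), new EegRecognizer());
        recognizers.forEach(r -> { r.addListener(logger); r.poll(); });
    }
}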

Tutorial
How to Do Multimodal Detection of Affective States?

IEEE International Conference on Advanced Learning Technologies
Athens, Georgia, USA. July 2011
Tutorial.

Abstract
Considering the human element as crucial in designing and implementing interactive intelligent systems, this tutorial provides a description and hands-on demonstration of the detection of affective states, covering devices, methodologies, and data processing, as well as their impact on instructional design. The information that a computer senses in order to automate the detection of affective states spans an extensive set of data, ranging from brain-wave signals and biofeedback readings to face-based or gesture-based emotion recognition and posture or pressure sensing. The work presented in this tutorial is not about developing the algorithms or hardware that make this possible; rather, our concern is the encapsulation of preexisting systems (all of which we actually use) that implement those algorithms and use that hardware to improve learning.
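As a rough illustration of the “encapsulation of preexisting systems” idea discussed in the tutorial (not the tutorial’s actual code; every class, method, and value below is hypothetical and no real vendor API is implied), an Adapter-style wrapper can hide a vendor-specific SDK behind a neutral detector interface:

// Hypothetical Adapter-style sketch: a vendor-specific SDK is hidden behind a
// neutral AffectDetector interface so applications never call the SDK directly.
interface AffectDetector {
    double engagement();   // normalized 0..1
}

class VendorHeadsetSdk {                      // stand-in for a preexisting third-party SDK
    int readRawEngagement() { return 712; }   // vendor-specific units
}

class HeadsetAdapter implements AffectDetector {
    private final VendorHeadsetSdk sdk = new VendorHeadsetSdk();
    public double engagement() {
        return sdk.readRawEngagement() / 1000.0;   // map vendor units to 0..1
    }
}

public class AdapterSketch {
    public static void main(String[] args) {
        AffectDetector detector = new HeadsetAdapter();
        System.out.printf("engagement = %.2f%n", detector.engagement());
    }
}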

IEEE Digital Library
Javier Gonzalez-Sanchez, Robert M. Christopherson, Maria Elena Chavez-Echeagaray, David C. Gibson, Robert Atkinson, and Winslow Burleson, “How to Do Multimodal Detection of Affective States?,” Proceedings of the 2011 IEEE 11th International Conference on Advanced Learning Technologies (ICALT), pp. 654-655, 2011.

Slides
These are our slides for the tutorial; any comments are more than welcome.

Paper
ABE: An Agent-Based Software Architecture for a Multimodal Emotion Recognition Framework

Gonzalez-Sanchez, J., Chavez-Echeagaray, M.E., Atkinson, R., and Burleson, W., “ABE: An Agent-Based Software Architecture for a Multimodal Emotion Recognition Framework,” Proceedings of the 9th Working IEEE/IFIP Conference on Software Architecture (June 2011).

Abstract

The computer’s ability to recognize human emotional states from physiological signals is increasingly being used to create empathetic systems such as learning environments, health care systems, and videogames. Despite that, there are few frameworks, libraries, architectures, or software tools that allow system developers to easily integrate emotion recognition into their software projects. The work reported here offers a first step toward filling this gap, addressing: (a) the modeling of an agent-driven, component-based architecture for multimodal emotion recognition, called ABE, and (b) the use of ABE to implement a multimodal emotion recognition framework that supports third-party systems becoming empathetic systems.
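The paper’s actual ABE interfaces are not reproduced here; as a loose sketch of the agent-driven, component-based idea (all names and values below are hypothetical), each sensing channel could be wrapped by an agent that reports to a coordinator, which exposes one aggregated view to client applications:

import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Loose, hypothetical sketch of an agent-driven design: one agent per sensing
// channel, a coordinator that collects their latest reports, and a single
// aggregated snapshot handed to client (third-party) systems. Not the ABE API.
interface SensingAgent {
    String channel();                 // e.g. "eeg", "eye-tracker", "skin"
    Map<String, Double> report();     // latest affect estimates for this channel
}

class Coordinator {
    private final List<SensingAgent> agents;
    Coordinator(List<SensingAgent> agents) { this.agents = agents; }

    // Merge per-channel reports into one snapshot keyed by "channel.emotion".
    Map<String, Double> snapshot() {
        Map<String, Double> merged = new LinkedHashMap<>();
        for (SensingAgent a : agents) {
            a.report().forEach((emotion, value) ->
                    merged.put(a.channel() + "." + emotion, value));
        }
        return merged;
    }
}

public class AbeSketch {
    public static void main(String[] args) {
        SensingAgent eeg = new SensingAgent() {
            public String channel() { return "eeg"; }
            public Map<String, Double> report() { return Map.of("engagement", 0.71); }
        };
        SensingAgent skin = new SensingAgent() {
            public String channel() { return "skin"; }
            public Map<String, Double> report() { return Map.of("arousal", 0.44); }
        };
        System.out.println(new Coordinator(List.of(eeg, skin)).snapshot());
    }
}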

IEEE Digital Library
Javier Gonzalez-Sanchez, Maria Elena Chavez-Echeagaray, Robert Atkinson, and Winslow Burleson, “ABE: An Agent-Based Software Architecture for a Multimodal Emotion Recognition Framework,” Proceedings of the 2011 Ninth Working IEEE/IFIP Conference on Software Architecture (WICSA), pp. 187-193, 2011.

Slides
These are the slides for the paper presentation at WICSA; any comments are more than welcome. The paper presenters briefly introduce their work during the first half (45 minutes) of the session, and the second half is used to brainstorm questions with the attendees.

Emotiv Developer Published Papers

Short Talk
Covert and Overt Measures of Engagement Within an Educational Multimedia Environment

8th Annual Games for Change Festival,
New York, NY, USA. June 2011.
Short Talk
Wednesday (June 22) — NYU Law School (40 Washington Square South), Greenberg Lounge.

Abstract

Within the social sciences there is a well-established base of research that has helped shed light on the complexity of human social interactions. Recently, in the developing arena of game design and research, there has been a shift from expert-driven experience decisions and a market research orientation toward a more objective, user-centric, data-driven scientific approach. While the gaming community is making more of an effort to empirically assess the quality of users’ experiences, the educational and learning sciences are adopting more game-like features to help improve engagement within digital learning environments, reduce frustration, and, in general, assess students’ emotional states to help them persist through challenging tasks.

In addition to the two fields benefiting from one another’s strengths, both are also capitalizing on recent advances in physiological sensing devices. These devices capture data as users interact with various systems; the data are used to improve the assessment of user experiences as well as to enhance user engagement.

An example of this symbiotic relationship is the use of physiological data gathered while a user is immersed in a gaming scenario; these data are used to better understand a user’s cognitive and affective states as they relate to performance within the game. Recent studies have explored the feasibility of such a scenario. One setup used the high-fidelity, deeply engaging, well-known video game Guitar Hero as the stimulus, together with a suite of unobtrusive psychophysiological sensing devices that captured the user’s affective states during play. These sensing devices included the Emotiv® EPOC headset, the Tobii® Eye Tracking System, a custom pressure-sensing game controller (guitar), and a skin conductance sensor.

Utilizing a multimodal approach to sensing, integrating, and synchronizing the data, we investigated the levels of engagement detected by the Emotiv® EPOC EEG headset, the visual attention captured through unobtrusive eye-tracking methods, the pressure on the guitar’s buttons, and the emotional arousal captured by the other sensing devices.
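The exact synchronization pipeline used in this work is not described in this post; a minimal sketch of one common approach, aligning differently sampled streams onto a shared reference clock by nearest timestamp, might look like the following (all names and values are hypothetical toy data):

import java.util.TreeMap;

// Minimal, hypothetical sketch of timestamp-based alignment: each modality's
// samples are indexed by time, and for every reference tick we take the sample
// closest in time from each stream. Not the study's actual pipeline.
public class SyncSketch {

    // Return the value of the sample nearest to time t (streams may differ in rate).
    static double nearest(TreeMap<Long, Double> stream, long t) {
        var floor = stream.floorEntry(t);
        var ceil  = stream.ceilingEntry(t);
        if (floor == null) return ceil.getValue();
        if (ceil == null)  return floor.getValue();
        return (t - floor.getKey() <= ceil.getKey() - t) ? floor.getValue() : ceil.getValue();
    }

    public static void main(String[] args) {
        TreeMap<Long, Double> eegEngagement = new TreeMap<>();
        TreeMap<Long, Double> buttonPressure = new TreeMap<>();
        // Toy data: EEG at ~8 Hz, button pressure at ~100 Hz (values are made up).
        for (long t = 0; t <= 1000; t += 125) eegEngagement.put(t, 0.5 + t / 4000.0);
        for (long t = 0; t <= 1000; t += 10)  buttonPressure.put(t, Math.sin(t / 100.0));

        // Resample both streams onto a shared 250 ms reference clock.
        for (long t = 0; t <= 1000; t += 250) {
            System.out.printf("t=%4d ms  engagement=%.2f  pressure=%.2f%n",
                    t, nearest(eegEngagement, t), nearest(buttonPressure, t));
        }
    }
}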

The goal of this paper is to present state-of-the-art advances in the fields of educational technology, computer science, psychology, and data mining used for assessing users’ affective states, and to explain how these advances can be used to objectively measure and enhance the user experience within an interactive multimedia environment (e.g., computer games and intelligent tutors).

Slides

These are our slides for the short talk; any comments are more than welcome.

Presentations: Video Game Design and Learner Characteristics

During the 2010-2011 school year, LSRL member John M. Quick led two research studies that sought to identify relationships between undergraduate learners’ characteristics and their enjoyment of video games.

Information about video game feature preferences, play habits, and personality traits was collected from hundreds of undergraduate learners during the fall and spring semesters. These data are being used to identify the key design features that most influence learners’ perception of video game experiences and how personality traits relate to learners’ preferences for and use of video games.

John will present this work at two upcoming conferences in 2011. In June, he will be at the Games, Learning, and Society conference in Madison, WI to share the six key design features that influence student enjoyment of video games and discuss the impacts of involuntary play in formal settings on learner experience. In September, John will be at the EARLI conference in Exeter, UK to share the undergraduate learner personas that emerged from the analysis of students’ video game preferences, play habits, and personality traits. You are welcome to contact John about this research and related topics at John.M.Quick [at] asu [dot] edu.

Presentation
Building an Emotion Recognition Framework

Center for Cognitive Ubiquitous Computing (CUbiC) Seminar Series.
Tempe, Arizona, USA. April 2011.
Short Talk
Friday (April 15) — ASU Tempe Main Campus. Room 380 of the Brickyard building.

Abstract

The computer’s ability to recognize human emotional states from physiological signals is increasingly being used to create empathetic systems such as learning environments, health care systems, and videogames. Despite that, there are few frameworks, libraries, architectures, or software tools that allow system developers to easily integrate emotion recognition into their software projects.

In that context, our work (in collaboration with the Motivational Environment Group, the Learning Science Research Lab, and the Affective Metatutor Group) addresses the construction of a multimodal emotion recognition framework to support third-party systems becoming empathetic systems. Our approach is multimodal and includes brain-computer interfaces, eye tracking, face-based emotion recognition, and sensors that measure physiological signals (such as skin conductivity, posture, and finger pressure).
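How the framework actually combines these channels is not spelled out in this post; as one hedged illustration (names, weights, and values below are made up), a simple confidence-weighted average of per-modality estimates of the same affective dimension could look like this:

import java.util.List;

// Hypothetical sketch of late fusion: each modality contributes an estimate of
// the same affective dimension plus a confidence, and the fused value is a
// confidence-weighted average. All weights and values below are made up.
public class FusionSketch {

    record ModalityEstimate(String modality, double value, double confidence) {}

    static double fuse(List<ModalityEstimate> estimates) {
        double weighted = 0.0, totalConfidence = 0.0;
        for (ModalityEstimate e : estimates) {
            weighted += e.value() * e.confidence();
            totalConfidence += e.confidence();
        }
        return totalConfidence == 0 ? 0.0 : weighted / totalConfidence;
    }

    public static void main(String[] args) {
        List<ModalityEstimate> frustration = List.of(
                new ModalityEstimate("eeg", 0.60, 0.8),
                new ModalityEstimate("face", 0.75, 0.5),
                new ModalityEstimate("skin-conductance", 0.55, 0.6));
        System.out.printf("fused frustration = %.2f%n", fuse(frustration));
    }
}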

Come to our talk and let us share with you some hands-on demos and ideas.

Slides

These are my slides for the short talk; any comments are more than welcome.

Presentation
Virtual Worlds Best Practices in Education

The fourth annual Virtual Worlds Best Practices in Education conference was held from Thursday, March 17, 2011 until Saturday, March 19, 2011. VWBPE focuses on education and learning in virtual environments. This year’s theme was “You are Here”. The event covered over 20 sims in the virtual world of Second Life and included workshops, presentations, and more.

The Virtual Worlds Best Practices in Education (VWBPE) conference originated from the 2007 Second Life Best Practices in Education Conference. This grassroots, community-based conference attracts faculty, instructors, trainers, administrators, instructional designers, technical specialists, and members of organizations from around the world. Those who create teaching/learning environments, resources, tools, support services, and professional development opportunities inside and outside of virtual world environments participate. During the conference, participants have opportunities to ask: What is education? What is teaching? What is learning? And how can we provide virtual world educational environments in which today’s learners can become all they can be?

See you in Second Life!
:-)

Slides and Video
Our research group presented at this event on Thursday, March 17. These are our slides and the video of the presentation; any comments are more than welcome.

From treet.tv: find out about the fascinating neuroscience work being done with special EEG headsets to map emotions, engagement, and thinking by monitoring brain patterns. Dr. Gibson’s team is from Arizona State University.