Doug Ramsey, UC San Diego
Computer scientists have developed a technology that uses facial expression recognition to detect how engaged students are during a class and to predict how well they will do in it. The team, led by scientists at the University of California, San Diego and Emotient, a San Diego-based provider of facial expression recognition technology, showed that the system detects students’ level of engagement in real time as accurately as human observers do. The team also included researchers from Virginia Commonwealth University and Virginia State University.
The early online version of the paper, “The Faces of Engagement: Automatic Recognition of Student Engagement,” appeared today (April 15) in the journal IEEE Transactions on Affective Computing.
“Automatic recognition of student engagement could revolutionize education by increasing understanding of when and why students get disengaged,” said Jacob Whitehill, a researcher in the Machine Perception Lab at UC San Diego’s Qualcomm Institute and a co-founder of Emotient. “Automatic engagement detection provides an opportunity for educators to adjust their curriculum for higher impact, either in real time or in subsequent lessons. Automatic engagement detection could be a valuable asset for developing adaptive educational games, improving intelligent tutoring systems and tailoring massive open online courses, or MOOCs.”
Whitehill (Ph.D. ’12) recently received his doctorate from the Department of Computer Science and Engineering at UC San Diego’s Jacobs School of Engineering.
The study involved training an automatic detector that measures how engaged a student appears in webcam video recorded while the student undergoes cognitive skills training on a tablet. The detector uses automatic expression recognition technology to analyze the student’s facial expressions on a frame-by-frame basis and estimate his or her engagement level.
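As a rough illustration of that frame-by-frame approach, the hypothetical Python sketch below detects a face in each frame of a webcam recording, extracts simple placeholder features from the face region, and averages the output of a pre-trained engagement regressor over the video. The feature extractor and the regressor are stand-ins for the study’s actual expression-recognition pipeline; this is not Emotient’s or the authors’ system.

```python
# Minimal sketch of frame-by-frame engagement estimation from webcam video.
# The feature extractor and the trained regressor below are hypothetical
# placeholders, not the expression-recognition pipeline used in the study.
import cv2
import numpy as np

# OpenCV's bundled Haar cascade for frontal face detection.
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def expression_features(face_gray):
    """Placeholder features: downsampled pixel intensities of the face crop.
    The actual study used facial expression recognition outputs instead."""
    face_small = cv2.resize(face_gray, (24, 24))
    return face_small.flatten().astype(np.float32) / 255.0

def estimate_engagement(video_path, regressor):
    """Average a per-frame engagement estimate over a recorded video.
    `regressor` is any pre-trained model exposing a .predict() method."""
    capture = cv2.VideoCapture(video_path)
    scores = []
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = face_detector.detectMultiScale(gray, scaleFactor=1.1,
                                               minNeighbors=5)
        if len(faces) == 0:
            continue  # skip frames where no face is detected
        x, y, w, h = faces[0]
        features = expression_features(gray[y:y + h, x:x + w])
        scores.append(float(regressor.predict(features.reshape(1, -1))[0]))
    capture.release()
    return float(np.mean(scores)) if scores else None
```

In practice, the per-frame scores could also be kept as a time series rather than averaged, so an educator or tutoring system could see when during a lesson engagement dropped.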
“This study is one of the most thorough to date in the application of computer vision and machine learning technologies to automatic student engagement detection,” said Javier Movellan, co-director of the Machine Perception Lab at UC San Diego and co-founder and lead researcher at Emotient. “The possibilities for its application in education and beyond are tremendous. By understanding what parts of a lecture, conversation, game, advertisement or promotion produced different levels of engagement, an individual or business can obtain valuable feedback to fine-tune the material into something more impactful.”
In addition to Movellan and Whitehill, the study’s authors include Zewelanji Serpell, professor of developmental psychology at Virginia Commonwealth University, as well as Yi-Ching Lin and Aysha Foster of the Department of Psychology at Virginia State University.
Emotient was founded by a team of six Ph.D.s from UC San Diego who are among the foremost experts in applying machine learning, computer vision and cognitive science to facial behavior analysis. Its proprietary technology sets the industry standard for accuracy and real-time delivery of facial expression data and analysis. Emotient’s facial expression technology is currently available as an API for Fortune 500 companies in consumer packaged goods, retail, health care, education and other industries.