New facial analysis software is capable of recognizing emotional states, identifying struggling students
By End the Lie
A study recently published by researchers from North Carolina State University (NCSU) reveals that video cameras outfitted with facial analysis software can recognize students' emotional states and identify those who are struggling.
In this case, the researchers used video cameras to monitor the faces of college students who were being tutored via computer.
Software capable of matching facial expressions to various levels of engagement or frustration allowed the researchers to recognize when students were either insufficiently challenged by the work or finding it too difficult.
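The article does not describe how the software maps expressions to learning states, but the general idea can be sketched with a toy classifier. Everything here is hypothetical: the function name, the score inputs, and the thresholds are illustrative stand-ins, not the study's actual method.

```python
# Hypothetical sketch: map per-frame expression scores (e.g., from a
# facial-analysis toolkit) to a coarse learning state. The score names
# and threshold values are illustrative, not taken from the NCSU study.
def classify_state(frustration: float, engagement: float) -> str:
    """Map two expression scores in [0, 1] to a coarse learning state."""
    if frustration > 0.6:
        return "too difficult"    # student appears frustrated
    if engagement < 0.3:
        return "not challenged"   # student appears disengaged
    return "on track"

print(classify_state(0.8, 0.5))  # -> too difficult
print(classify_state(0.1, 0.2))  # -> not challenged
```

A real system would smooth these judgments over many frames rather than reacting to a single moment, but the thresholding idea is the same.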
Similar software has made remarkable strides in recent years, even demonstrating the ability to spot liars better than human experts can.
This particular piece of software could be used to help teachers track student performance in real time, whether in physical or online classrooms.
The advances “could even help massively open online courses (or MOOCs), which can involve many thousands of students working remotely, to be more attuned to students’ needs,” according to MIT Technology Review.
Will Knight, writing for the Technology Review, states that this could be part of a broader revolution in computing called “affective computing,” wherein various devices are capable of identifying and responding to emotion.
In the study, the NCSU researchers recorded students as they learned to write Java code using the “JavaTutor” software.
Some 60 hours of video footage was analyzed using the Computer Expression Recognition Toolbox, a computer program that recognizes facial expressions.
The researchers then compared the software’s analysis with the students’ own reports of their state of mind and discovered that they were a close match.
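The validation step described above — checking the software's labels against students' self-reports — amounts to an agreement measurement. A minimal sketch, using made-up labels (the study's actual data and agreement metric are not reproduced here):

```python
# Toy illustration (hypothetical data): how often does an automated
# facial-expression classifier agree with students' self-reported states?
software_labels = ["engaged", "frustrated", "engaged", "bored", "engaged"]
self_reports    = ["engaged", "frustrated", "engaged", "engaged", "engaged"]

matches = sum(s == r for s, r in zip(software_labels, self_reports))
agreement = matches / len(software_labels)
print(f"Agreement: {agreement:.0%}")  # 4 of 5 labels match -> 80%
```

In practice researchers typically also use chance-corrected statistics (such as Cohen's kappa) rather than raw agreement alone, since some matches occur by chance.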
Ultimately, they want to develop a tutoring system that assists students having difficulty with their studies and “bolster[s] their confidence and keep[s] them motivated,” said Joseph Grafsgaard, a PhD student at NCSU who coauthored the paper, which will be presented at the Sixth International Conference on Educational Data Mining.
Knight points out that others are exploring potential applications for affective computing in the field of education.
One such individual is Jacob Whitehall, a software engineer and research scientist with Emotient.
Emotient is a startup focused on commercial uses of affective computing in everything from market research to the triggering of “appropriate and meaningful responses from our digital world.”
Whitehall recently coauthored a paper, which has yet to be published, demonstrating that computer-identified facial expressions could actually predict test performance.
“There is an emerging agreement that facial-expression recognition can play a constructive role in teaching,” Whitehall said.
Yet at the same time, this kind of feedback can actually hinder teachers if they don’t know how or when to respond to information about a student’s emotional state.
“It’s a hard problem to know how to use these [emotion] sensors effectively,” Whitehall said.
Still, Whitehall contends that the technology could be quite useful for today’s increasingly popular online learning platforms.
“Udacity and Coursera have on the order of a million students, and I imagine some fraction of them could be persuaded to turn their webcams on,” Whitehall said. “I think you would learn a lot about what parts of a lecture are working and what parts are not, and where students are getting confused.”
It is interesting to note that this kind of technology is also being applied to areas considerably less admirable than helping student engagement and success, such as so-called “threat assessments.”
Hopefully these advances will indeed be used to create more successful learning environments instead of being leveraged for market research and threat assessment.
I’d love to hear your opinion, take a look at your story tips, and even review your original writing if you would like to get it published. Please email me at [email protected]