Although there are plenty of weird headphones and TWS earbuds floating around the market, I bet none of them can track your facial expressions. I mean, they are made for listening to AC/DC or System of a Down, not for tracking cheek muscles, right? Well, that is not how the researchers at Cornell University see it, as they have developed a pair of headphones that tracks facial expressions in real time.
Dubbed C-Face, these unique over-the-ear headphones, developed by Cornell researchers, sit over your ears and track the contours of your cheeks to determine your facial expression. These expressions are then mirrored by an avatar living inside a virtual reality environment.
“This device is simpler, less obtrusive, and more capable than any existing ear-mounted wearable technologies for tracking facial expressions,” said Cheng Zhang, Assistant Professor of Information Science and the senior author of the paper, which will be presented at the Association for Computing Machinery Symposium on User Interface Software and Technology, held virtually starting October 20.
How Does It Work?
Now, the C-Face comes equipped with two tiny RGB cameras, one attached to each of its earcups. Thanks to these two cameras, the device is able to record changes in facial contours as the facial muscles move.
“The most exciting finding is that facial contours are highly informative of facial expressions. When we perform a facial expression, our facial muscles stretch and contract. They push and pull the skin and affect the tension of nearby facial muscles. This effect causes the outline of the cheeks (contours) to alter from the point of view of the ear,” wrote the researchers.
So, after the device captures these changes in the facial muscles, the images are reconstructed using computer vision and a deep learning model. And since the captured images are 2D, the researchers deployed an AI model that specializes in classifying, detecting, and retrieving images to reconstruct the contours of the expressions.
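To get a feel for the first step of that pipeline, here is a minimal sketch (not the authors' actual code) of how a contour profile could be pulled from an ear-side cheek image: threshold the frame into a silhouette, then record where the cheek edge begins on each row. The image below is synthetic; in the real system, features like this would be fed into the deep learning model.

```python
import numpy as np

def cheek_contour_profile(frame, threshold=0.5):
    """For each row of a 2D grayscale frame (values in [0, 1]),
    return the column where the cheek silhouette first appears,
    or -1 if the row is all background."""
    mask = frame > threshold                 # crude skin/background split
    profile = np.full(frame.shape[0], -1)
    for r, row in enumerate(mask):
        cols = np.flatnonzero(row)
        if cols.size:
            profile[r] = cols[0]             # leftmost silhouette pixel
    return profile

# Synthetic 6x8 "cheek" frame whose silhouette edge drifts rightward,
# mimicking how the cheek outline shifts as facial muscles move.
frame = np.zeros((6, 8))
for r in range(6):
    frame[r, r + 1:] = 1.0

profile = cheek_contour_profile(frame)
print(profile)  # the contour feature vector a model could consume
```

As the expression changes, this profile vector changes shape, which is exactly the kind of signal the researchers describe the cameras observing from the point of view of the ear.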
You can check out the official demo video of the “C-Face” facial-tracking headphones right below.
Now, according to the researchers, the facial expressions, determined by 42 distinct feature points, can be translated into various emojis, including natural, kiss-face, and angry-face. Moreover, the tech can also enable “silent speech commands,” letting users control a music player with facial expressions alone.
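To make that mapping concrete, here is a small illustrative sketch, with invented template vectors and command names, of how 42 tracked feature points could be matched to the nearest known expression and then routed to a music-player action. This is only a nearest-neighbor stand-in, not the paper's actual classifier.

```python
import numpy as np

# Hypothetical templates: each expression is a flattened set of
# 42 (x, y) feature points, i.e. an 84-dimensional vector.
rng = np.random.default_rng(0)
TEMPLATES = {
    "natural": rng.random(84),
    "kiss-face": rng.random(84),
    "angry-face": rng.random(84),
}

# Illustrative "silent speech command" mapping (names are made up).
SILENT_COMMANDS = {"kiss-face": "play/pause", "angry-face": "next track"}

def classify_expression(points):
    """Nearest-template match over the 42 tracked feature points."""
    return min(TEMPLATES, key=lambda k: np.linalg.norm(points - TEMPLATES[k]))

def player_action(points):
    """Turn a recognized expression into a music-player command."""
    return SILENT_COMMANDS.get(classify_expression(points), "no-op")

sample = TEMPLATES["kiss-face"] + 0.01   # a slightly noisy reading
print(classify_expression(sample))       # -> "kiss-face"
print(player_action(sample))             # -> "play/pause"
```

A natural (neutral) reading falls through to "no-op", so the player only reacts to deliberate expressions.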
And since the device works by detecting the muscle movements of the face, it can keep tracking expressions even when the user is wearing a face mask.