The buds are outfitted with electrodes that can detect and identify different facial muscle movements from changes in the shape of a wearer’s ear canal. Those readings are then used to send input commands to the phone.
“We’re not trying to replace current input methods, just complement them,” Matthies said in an interview with New Scientist. The facial-sensing buds could potentially end up as an entirely new way to control your smartphone, working alongside voice and touch commands and even AR camera features.
The current version of the prototype can detect five expressions at a 90 percent success rate: smiling, winking, turning the head to the right, opening the mouth, and making a “shh” sound. It’s not quite the entire range of human expression—what about the 🙃 emoji?—but it’s a start. The system is slightly less accurate when its wearers are on the move, dropping to an 85.2 percent success rate.
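To picture how a system like this might plug into a phone, consider the pipeline the article implies: a classifier emits an expression label with some confidence, and the phone maps that label to an input command. Here's a minimal sketch of that mapping step. Everything in it, including the action names and the confidence threshold, is hypothetical and illustrative; it is not taken from Matthies's actual prototype.

```python
from typing import Optional

# The five expressions the prototype reportedly recognizes.
EXPRESSIONS = ["smile", "wink", "head_right", "open_mouth", "shh"]

# Hypothetical mapping from a recognized expression to a phone action.
ACTIONS = {
    "smile": "send_positive_reply",
    "wink": "dismiss_notification",
    "head_right": "next_track",
    "open_mouth": "answer_call",
    "shh": "mute_audio",
}

def dispatch(expression: str, confidence: float,
             threshold: float = 0.9) -> Optional[str]:
    """Return the phone action for a recognized expression,
    or None when the classifier's confidence is too low."""
    if expression in ACTIONS and confidence >= threshold:
        return ACTIONS[expression]
    return None

# A confident smile triggers its mapped action; a shaky
# reading (say, while the wearer is walking) is ignored.
print(dispatch("smile", 0.95))  # send_positive_reply
print(dispatch("shh", 0.60))    # None
```

The threshold is doing the same job the article's accuracy figures hint at: below some confidence, it's safer for the system to do nothing than to fire the wrong command.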
Matthies and his team have written at length about their work, and plan to present the prototype at the ACM CHI Conference on Human Factors in Computing Systems in Denver next month.
There’s no way to know if or when the tech could make its way to your favorite brand of earbuds—and for now, Matthies told New Scientist, it’s still just a research project. There’s no word of any partnerships yet, or whether the team will strike out on its own to build the buds itself.
But Matthies thinks the tech will be possible soon. So if, in the future, you walk by someone on the street wearing earbuds and grinning at nothing in particular, there’s a chance they’re just sending out a happy text. There are worse reasons to smile.