People with amyotrophic lateral sclerosis (ALS) gradually lose control of the muscles that allow them to speak. And while some communication devices exist for ALS patients, most are bulky and hard to carry around, making them cumbersome for daily use. Even worse, they’re not always accurate. That, however, may soon change thanks to a new device from MIT.
Looking for a less obtrusive, more reliable alternative, researchers at MIT created a flexible, stretchable, and affordable device that users can temporarily apply to their facial skin, enabling them to communicate more freely.
Called cFaCES (conformable Facial Code Extrapolation Sensor), the new tool uses piezoelectric sensors made of aluminum nitride, which are embedded in a thin silicone film. The sensors detect the user’s facial movements and convert them into electrical signals measured by an accompanying handheld processing unit.
Because each facial movement, be it a twitch or a smile, produces a signal of different strength, the device can tell the user’s expressions apart and map them to simple messages such as “I love you” or “I’m hungry.”
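To illustrate the general idea, here is a minimal sketch in Python of how per-movement signal strengths could be matched against calibrated templates and translated into messages. The sensor count, voltage values, expression-to-message mapping, and nearest-template matching are all invented for illustration; they are not taken from the MIT study.

```python
import numpy as np

# Hypothetical template library: peak signal strength (mV) per sensor for
# each expression, as might be collected during a per-user calibration
# session. Four sensors and all values here are illustrative assumptions.
TEMPLATES = {
    "smile":       np.array([12.0,  3.5,  8.0,  2.0]),
    "pursed lips": np.array([ 4.0, 10.5,  2.5,  9.0]),
    "open mouth":  np.array([ 9.0,  7.0, 14.0,  6.5]),
}

# Hypothetical mapping from recognized expressions to short messages.
MESSAGES = {
    "smile": "I love you",
    "pursed lips": "I'm hungry",
    "open mouth": "Yes",
}

def classify(sample: np.ndarray) -> str:
    """Return the expression whose calibrated template is closest
    (by Euclidean distance) to the measured signal strengths."""
    return min(TEMPLATES, key=lambda label: np.linalg.norm(sample - TEMPLATES[label]))

if __name__ == "__main__":
    # A noisy reading that should land nearest the "smile" template.
    reading = np.array([11.2, 4.1, 7.6, 2.4])
    expression = classify(reading)
    print(expression, "->", MESSAGES[expression])
```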
After testing the device on two ALS patients, the researchers were delighted to find that it was 75 percent accurate at distinguishing between three facial expressions: smiling, pursed lips, and an open mouth.
The scientists are now working with the patients to improve the technology’s accuracy and expand the range of facial movements it can identify. Once commercialized, the device is estimated to cost as little as $10.