Machine learning and new techniques in biomedical engineering are enabling robotic hands that let amputees regain much of the function of a real hand.
Recently, researchers at the University of Michigan tapped faint, latent signals from arm nerves and amplified them to create a prosthetic that enables real-time, intuitive, finger-level control of a robotic hand. To achieve this, the researchers developed a way to tame temperamental nerve endings, separate thick nerve bundles into smaller fibers that enable more precise control, and amplify the signals coming through those nerves.
The approach involves tiny muscle grafts and machine learning algorithms borrowed from the brain-machine interface field. The latter is important: unlike other prosthetics, which amputees must use extensively before they can operate them properly, the robotic hand in this study requires no training on the user's part, because the algorithms have already learned the mapping for them.
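To make the decoding idea concrete, here is a minimal, hypothetical sketch of the kind of decoder used in brain-machine interface work: a linear (ridge-regression) map fitted once from recorded nerve-signal features to finger positions, so that at run time the prosthetic only applies a fixed matrix. All variable names, dimensions, and the simulated data are illustrative assumptions, not details from the study.

```python
import numpy as np

# Illustrative sketch of a brain-machine-interface-style decoder:
# a linear map from amplified nerve-signal features to finger positions.
# Dimensions and simulated data are assumptions for demonstration only.

rng = np.random.default_rng(0)
n_samples, n_channels, n_fingers = 500, 16, 5

# Simulated training data: per-frame nerve-signal features (e.g., smoothed
# voltages from the muscle grafts) and the finger positions they encode.
true_map = rng.normal(size=(n_channels, n_fingers))
features = rng.normal(size=(n_samples, n_channels))
positions = features @ true_map + 0.01 * rng.normal(size=(n_samples, n_fingers))

# Fit the decoder once, offline, via ridge regression. This is the "learning"
# the algorithm does in advance, so the user does not have to train.
lam = 1e-3  # small ridge penalty for numerical stability
W = np.linalg.solve(
    features.T @ features + lam * np.eye(n_channels),
    features.T @ positions,
)

def decode(frame):
    """Map one frame of nerve features to an estimate per finger."""
    return frame @ W

# Real-time use is then a single matrix multiply per frame.
estimate = decode(rng.normal(size=n_channels))
print(estimate.shape)  # one position estimate for each of the five fingers
```

Because fitting happens offline and decoding is a single matrix multiply, this style of decoder can run in real time, which is consistent with the intuitive, finger-level control described above.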
While study participants aren’t yet allowed to take the arm home, in the lab they were able to pick up blocks with a pincer grasp; move their thumb in a continuous motion, rather than having to choose between two positions; lift spherically shaped objects; and even play a version of Rock, Paper, Scissors called Rock, Paper, Pliers.
According to one study participant, “you can pretty much do anything you can do with a real hand with that [robot] hand.”