
An augmented reality (AR) system could help people with severe motor impairments operate a robot to perform routine tasks for them. Daily activities such as eating, tending to personal hygiene, and scratching an itch present challenges for these patients, and the system, described recently in the journal PLOS ONE, could provide an effective solution.
The surrogate robot is equipped with cameras in its head, which the user accesses through a web-based interface. Clickable controls are displayed over the device's camera view, allowing the user to move the robot around its environment and operate its arms and hands. For instance, when moving the robot's head, a pair of eyeballs is displayed as the mouse cursor to show where the robot will look after the click. Clicking a disc displayed around each robotic hand lets the user choose among various motions.
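The paper's interface code is not reproduced here, but the click-to-look interaction it describes can be sketched roughly as follows. This is an illustrative assumption, not the actual system: the camera resolution, fields of view, and function name are invented for the example.

```python
# Illustrative sketch (not the published system): turning a click on the
# robot's camera view into a head pan/tilt command. All camera parameters
# below are assumed values for demonstration.
import math

IMAGE_WIDTH, IMAGE_HEIGHT = 640, 480   # assumed camera resolution in pixels
HORIZONTAL_FOV = math.radians(60)      # assumed horizontal field of view
VERTICAL_FOV = math.radians(45)        # assumed vertical field of view

def click_to_pan_tilt(x, y):
    """Convert a pixel click (x, y) into pan/tilt offsets in radians.

    A click at the image center yields (0, 0); clicking right of center
    pans right, and clicking above center tilts up.
    """
    pan = ((x - IMAGE_WIDTH / 2) / IMAGE_WIDTH) * HORIZONTAL_FOV
    tilt = -((y - IMAGE_HEIGHT / 2) / IMAGE_HEIGHT) * VERTICAL_FOV
    return pan, tilt

# Clicking the exact center leaves the head where it is.
print(click_to_pan_tilt(320, 240))
```

The idea is simply that each pixel offset from the image center maps to a proportional angular offset of the robot's gaze, which is what lets a purely point-and-click interface drive the head without any specialized input hardware.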
In the paper, which reports on two studies of the robot body surrogate, the researchers found that such systems could significantly improve users' quality of life.
“Our results suggest that people with profound motor deficits can improve their quality of life using robotic body surrogates,” said Phillip Grice, a recent Georgia Institute of Technology Ph.D. graduate and first author of the paper. “We have taken the first step toward making it possible for someone to purchase an appropriate type of robot, have it in their home and derive real benefit from it.”
Alongside Professor Charlie Kemp of the Wallace H. Coulter Department of Biomedical Engineering at Georgia Tech and Emory University, Grice used a PR2 mobile manipulator made by Willow Garage in both studies. The robot has 20 degrees of freedom, with two arms and a head, allowing it to handle objects like water bottles, washcloths, hairbrushes, and even an electric shaver.
“Our goal is to give people with limited use of their own bodies access to robotic bodies so they can interact with the world in new ways,” said Kemp.
The first study made the PR2 available to 15 participants with severe motor impairments. Each of them learned to control the robot remotely, using their own assistive equipment to dictate the robot's actions. The participants controlled the robot from their own homes, while the robot itself remained with the researchers in a testing environment. The team found that 80 percent of the participants were able to make the robot bring a water bottle to the mouth of a mannequin.
“Compared to able-bodied persons, the capabilities of the robot are limited,” said Grice. “But the participants were able to perform tasks effectively and showed improvement on a clinical evaluation that measured their ability to manipulate objects compared to what they would have been able to do without the robot.”
In the second study, the researchers put the whole system to the test by giving it to Henry Evans, a man with severe physical impairments who has been helping Georgia Tech researchers study assistive robotic systems since 2011. He tested the robot in his home for a week, not only completing simple tasks but also devising new ones, such as using one arm to hold a washcloth and the other a brush.
“The system was very liberating to me, in that it enabled me to independently manipulate my environment for the first time since my stroke,” said Evans. “With respect to other people, I was thrilled to see Phil get overwhelmingly positive results when he objectively tested the system with 15 other people.”
Sources: Georgia Tech, Interesting Engineering