A Little Pick-me-up Goes a Long Way

Man using robotic arm device

If you want to pick up a warm mug of coffee, and not have any of it end up in your lap, it helps to have full sensory and motor control over your arms and hands. When your neurological and musculoskeletal assets work together, you can feel the weight and temperature of the mug and adjust your grip accordingly.

Tasks like that become much more difficult when a person relies on a prosthetic arm or—granted there aren’t many people doing this, but this is in our future—a robotic one.

Bioengineers in the University of Pittsburgh's Rehab Neural Engineering Labs have found that adding brain stimulation that evokes tactile sensations makes it easier for an operator to manipulate a brain-controlled robotic arm. Their results were published in Science in May. Jennifer Collinger and Robert Gaunt, both associate professors of physical medicine and rehabilitation at Pitt, were senior authors on the study.

Nathan Copeland volunteered for the study. He has had limited use of his arms and legs since a car crash that also left him without feeling in his arms and hands.

After the researchers supplemented Copeland's vision with artificial tactile perception, he cut the time he spent grasping and transferring objects roughly in half, from a median of 20.9 seconds to 10.2 seconds.

This paper is a step forward from a 2016 study on sensation, for which Copeland also volunteered. That earlier work showed that stimulating sensory regions of his brain with tiny electrical pulses evoked sensations in distinct regions of his hand.

In the new study, Gaunt and Collinger's team used that stimulation to give Copeland tactile feedback from the robotic hand as he controlled the arm.

“Doing the task while receiving the stimulation just went together like PB&J,” says Copeland.

Gaunt says: “We still have a long way to go in terms of making the sensations more realistic and bringing this technology to people’s homes, but the closer we can get to recreating the normal inputs to the brain, the better off we will be.”