Cathy Hutchinson was a 53-year-old mother of two who, in 1996, suffered a brain-stem stroke, leaving her a quadriplegic. Ten years later, she became a research subject of a company called Cyberkinetics, which implanted in her brain a device called the Utah array, “a pill-sized implant whose 96 microelectrodes bristle from its base like a bed of nails.”

Through this implant, Hutchinson was connected to a computer, with two robotic arms positioned at her side. She was instructed to think about moving one of the arms toward a nearby bottle, then to think about grasping the bottle. Her doing so “prompted the arm to execute a complex gesture — lowering the hand, grasping the bottle and lifting it off the table.”

She then “brought the arm toward her and positioned it by her mouth. She thought again of squeezing her hand, which this time prompted the arm to tilt at the wrist, tipping the bottle so she could drink from its straw.” It was the first time in almost 15 years that Hutchinson had been able to lift something on her own. “The smile on her face,” said Leigh Hochberg, one of the scientists on the project, “was something I and our whole research team will never forget.”

Journalist Malcolm Gay’s new book “The Brain Electric” describes recent efforts to create brain-computer interfaces, which would, among other applications, allow people with disabilities to lift and hold objects simply by thinking about doing so. By implanting electrodes onto someone’s brain, scientists can map “the electric current of thought itself — the millions of electrical impulses, known as action potentials, that consciously volley between the brain’s estimated 100 billion neurons.”

This “electric language” — like “an exponentially complicated form of Morse code” — is “what makes consciousness possible.” By translating these neural signals into computer language, scientists can create “a brain-computer interface,” granting subjects “mental control over computers and machines.”

The process begins by mapping a subject’s thoughts. While connected to a computer, a subject may think of lifting his left index finger (a real-life example Gay recounts from the work of scientist Eric Leuthardt). The computer then analyzes the neural patterns associated with this action and can “correlate them with specific commands — anything from recreating the lifted finger in a robot hand to moving a cursor across a monitor or playing a video game.”
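To make that decoding step concrete, here is a minimal sketch of the idea: summarize each trial as a vector of per-electrode firing rates, then fit a classifier that correlates those patterns with an intended action. Everything here is simulated and simplified; the 96-channel count echoes the Utah array, but the data, labels, and model are illustrative assumptions, not the methods used in the experiments Gay describes.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical recording: 96 channels (the Utah array's electrode count),
# each trial summarized as a vector of per-channel firing rates (spikes/sec).
N_CHANNELS, N_TRIALS = 96, 400

# Simulate two imagined actions: 0 = rest, 1 = "lift left index finger".
labels = rng.integers(0, 2, N_TRIALS)
rates = rng.normal(10.0, 2.0, (N_TRIALS, N_CHANNELS))

# Pretend ten channels are "tuned" to the movement: they fire faster
# whenever the subject imagines lifting the finger.
tuned = rng.choice(N_CHANNELS, size=10, replace=False)
rates[np.ix_(labels == 1, tuned)] += 3.0

# Fit a linear decoder that maps a firing-rate pattern to the intended action,
# and check how well it generalizes to held-out trials.
X_train, X_test, y_train, y_test = train_test_split(
    rates, labels, test_size=0.25, random_state=0
)
decoder = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"Held-out decoding accuracy: {decoder.score(X_test, y_test):.2f}")
```

With ten informative channels and a clean rate difference, this toy decoder separates the two intents almost perfectly; real neural data is far noisier and requires much more elaborate signal processing.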

Once the scientist has decoded these patterns, he can “conceivably link them to countless digital environments,” allowing the subject control over “everything from robotic appendages to Internet browsers.” While that endgame is still some way off, the potential applications are vast: quadriplegics could hold and lift objects, and even potentially walk again using computerized exoskeletons.
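Once a decoder emits discrete intents, linking them to “countless digital environments” is, at its simplest, a routing problem: each decoded intent triggers a command in whatever system is connected. The sketch below assumes hypothetical handler functions; no real robot or browser API is being invoked.

```python
# Hypothetical command routing: map each decoded intent to an action in
# whatever environment is attached (robot arm, cursor, browser, ...).
def rest() -> None:
    print("no movement")

def close_robot_hand() -> None:
    print("robot hand: executing grasp")

COMMANDS = {0: rest, 1: close_robot_hand}

# In a live system, `intent` would come from a decoder (like the sketch
# above) running on each new window of neural data.
intent = 1
COMMANDS[intent]()
```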

Sourced through Scoop.it from: nypost.com
