Muscles tapped for virtual input

By Kimberly Patch, Technology Research News
February 7, 2001

Communicating with computers more directly has long been a fantasy of computer junkies and, more recently, a dream of people who find typing painful.

A group of NASA researchers is bringing the idea closer to reality by intercepting the electrical signals the brain and muscles use to communicate in order to provide phantom joystick and keypad control.

"We're looking at ... connecting human beings to machines at a neuroelectronic level [by] trying to understand the underlying nervous system signals," said Charles Jorgensen, who heads the neural engineering laboratory at the NASA-Ames Research Center.

The researchers are using electrodes to literally listen in on the body's communications system. "Dry electrodes... are sitting on the surface of the skin and they're picking up the electrical signals [that tell the muscles to contract] as they're being sent to muscles," said Jorgensen.

The signals are then sent to a computer, which interprets them using the hidden Markov model algorithm commonly used for speech recognition, and neural net software, which is also good at recognizing patterns.
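
To make the recognition step concrete, here is a minimal sketch, assuming discrete observation symbols, of how one hidden Markov model per gesture can label a signal sequence by likelihood -- the same basic machinery used in speech recognition. The gesture names, state counts, and probabilities are invented placeholders, not NASA's trained models.

```python
import numpy as np

def forward_log_likelihood(obs, start_p, trans_p, emit_p):
    """Log-likelihood of a symbol sequence under one HMM (scaled forward algorithm)."""
    alpha = start_p * emit_p[:, obs[0]]   # probability of first symbol, per state
    scale = alpha.sum()
    log_lik = np.log(scale)
    alpha /= scale                        # rescale each step to avoid underflow
    for o in obs[1:]:
        alpha = (alpha @ trans_p) * emit_p[:, o]   # propagate states, emit symbol
        scale = alpha.sum()
        log_lik += np.log(scale)
        alpha /= scale
    return log_lik

def classify(obs, models):
    """Pick the gesture whose HMM assigns the sequence the highest likelihood."""
    return max(models, key=lambda g: forward_log_likelihood(obs, *models[g]))

# Two toy gestures, two hidden states each, three observation symbols
# (quantized muscle-signal energy: 0 = low, 1 = medium, 2 = high).
models = {
    "grip":    (np.array([0.9, 0.1]),                 # starts relaxed, energy rises
                np.array([[0.7, 0.3], [0.2, 0.8]]),
                np.array([[0.6, 0.3, 0.1],            # state 0 emits mostly low
                          [0.1, 0.3, 0.6]])),         # state 1 emits mostly high
    "release": (np.array([0.9, 0.1]),                 # starts contracted, energy falls
                np.array([[0.7, 0.3], [0.2, 0.8]]),
                np.array([[0.1, 0.3, 0.6],            # state 0 emits mostly high
                          [0.6, 0.3, 0.1]])),         # state 1 emits mostly low
}
print(classify([0, 0, 1, 2, 2], models))   # rising energy: prints "grip"
```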

The researchers used a sleeve containing eight electrodes to allow a pilot to control a simulation of a 757 passenger jet, according to Jorgensen. "You can reach into the air and grab an imaginary joystick and move the wrist without any joystick actually in your hand and the neural signals are then interpreted and sent to a class four aircraft simulation," he said.

They've also used the system to enter data into an imaginary keypad. "We're understanding... slight movements of the fingers in order to enter data on a keypad without [an actual] keyboard -- you can type in the air or on a picture of a keyboard," Jorgensen said.

One motivation for developing the system is to make it possible for astronauts to use computers while wearing bulky spacesuits and to better control things like robot arms, said Jorgensen. "There are a number of applications -- everything from a wearable cockpit to exoskeletal manipulation to emergency communication modes such as in hazardous suits where the suit inflates."

The difficult part of tapping nerve signals is isolating and interpreting them correctly, said Jorgensen. "These are extremely weak signals in a high noise environment because we are measuring them on the surface of the skin," he said.

Neurons fire by building up chemical energy in the form of sodium ions crossing a membrane, then releasing it all at once, causing a spike of electrical activity. "Each individual neuron has a little pulse of energy. The electrode is picking up thousands of individual nerves that are firing" at once, said Jorgensen. When the middle finger moves from the five to the eight on a keypad, for instance, the other fingers move as well, creating that much more noise on the surface of the skin.

"In the aggregate this winds up looking like a waveform," he said. This electromagnetic effect "propagates from cell to cell and [is] picked up by the sensors on the skin. We're just getting the average energy at different locations," Jorgensen said.

Electrical noise outside the body, like the electric fields produced by the flow of electricity in electrical hardware, also contributes to the electrical cacophony on the surface of the skin.

In addition to the noise, there are other problems in interpreting nerve signals through skin and over time. The electrodes are sensing the signals through a fat layer under the skin, and so the electrode position changes relative to the nerves as the hand moves, said Jorgensen. A cup of coffee can change things as well. "If you get caffeine in your system you are chemically a little different critter and your nervous system fires differently," he said.

The signals the electrodes find are amplified, filtered, and then interpreted using the hidden Markov and neural net algorithms. "There are some distinctly different wrinkles in... the signal processing algorithms. You've got to get moving averages, generate probability density clusters, identify transition states that the information is going through, and you have to map [these pieces of information] to a pattern recognition scheme that lets you label them in one way or another," said Jorgensen.
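
A hedged sketch of the front end of such a chain, assuming nothing about NASA's actual filters: rectify the raw signal, smooth it with a moving average, and quantize the resulting envelope into the discrete low/medium/high symbols that a recognizer like the HMM sketch above could label. The window length and thresholds are invented for illustration.

```python
import numpy as np

def moving_average(x, k=32):
    """k-point moving average down each channel (time is axis 0)."""
    kernel = np.ones(k) / k
    return np.apply_along_axis(
        lambda ch: np.convolve(ch, kernel, mode="valid"), 0, x)

def to_symbols(emg, k=32, edges=(0.02, 0.05)):
    """Rectify, smooth, and threshold each channel's envelope into
    symbols 0/1/2 (low/medium/high energy) for a pattern recognizer."""
    envelope = moving_average(np.abs(emg), k)        # moving-average envelope
    return np.digitize(envelope, edges)              # threshold into 3 symbols

symbols = to_symbols(0.05 * np.random.randn(2000, 8))
print(symbols.shape, symbols.min(), symbols.max())   # (1969, 8), symbols in 0..2
```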

Once the signals are sorted out, they're used to make changes in the position of the joystick or keys pressed on the keypad. The computing can be done using a 230 megahertz Pentium III chip, making it possible to use the system with a wearable computer, said Jorgensen.
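
The final mapping stage might look like the toy dispatch below; the gesture labels and step sizes are hypothetical, invented for the sketch rather than taken from the NASA system.

```python
# Hypothetical gesture labels mapped to joystick nudges; anything else
# is treated as a keypad press.
JOYSTICK = {"wrist_left": (-1, 0), "wrist_right": (1, 0),
            "wrist_up": (0, 1), "wrist_down": (0, -1)}

def apply_gesture(label, stick):
    """Return the new (x, y) stick position, or report a key press."""
    if label in JOYSTICK:
        dx, dy = JOYSTICK[label]
        return (stick[0] + dx, stick[1] + dy)
    print("keypad press:", label)
    return stick

stick = (0, 0)
for gesture in ["wrist_up", "wrist_up", "wrist_right", "key_5"]:
    stick = apply_gesture(gesture, stick)
print(stick)   # (1, 2)
```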

It's an interesting approach that is part of the general move in computer communications towards a wider range of sensory inputs and outputs, said Terry Winograd, a professor of computer science at Stanford University. "It might... be relevant to mobile computing, where, for example, you could have a keyless keyboard, just detecting the muscle motions of the fingers," he said. It may also prove useful for helping people with disabilities who can't use ordinary devices, Winograd said.

The NASA researchers are working on expanding the keypad control to the entire keyboard. Also in the works are plans to make the system trainable. "We want to look at adaptive algorithms to permit customization for different users like you have for speech recognition," said Jorgensen.

The researchers are also exploring a project that will attempt to combine electromyographic signals from the muscles of the neck with electroencephalographic signals from the brain to interpret speech. "One of the things that we're exploring is whether or not we might be able to do silent speech recognition -- electrodes would pick up subvocalization behavior [in the neck] -- and combine that with [brain wave] information and that may be enough for us to tease out what's being spoken," Jorgensen said.

The type of device control used in the joystick and keypad should be available for practical applications within five years, said Jorgensen.

Jorgensen's research colleagues were Kevin Wheeler of NASA and Slawomir Stepniewski of Recom Technology Corp. They presented research pertaining to the aircraft simulation model at the World Automation Congress Third International Symposium on Intelligent Automation and Control in Hawaii, June 11-16, 2000. The research was funded by NASA.

Timeline:   < 5 years
Funding:   Government
TRN Categories:   Human-Computer Interaction
Story Type:   News
Related Elements:  Technical paper, "Bioelectric Control of a 757 Class High Fidelity Aircraft Simulation," World Automation Congress, June 11-16, 2000, Hawaii.



