Imagine being able to land a jumbo jet without ever taking control of the
stick. NASA scientists recently demonstrated the ability to control a 757
passenger jet simulation, using only human muscle-nerve signals linked to
a computer.
Scientists outfitted the pilot with an armband embedded with eight dry
electrodes. The sensors read muscle-nerve signals as the pilot made the
gestures needed to land a computer-generated aircraft at San Francisco
International Airport in California. The pilot also demonstrated the
ability to land a damaged aircraft during emergency landing drills. The
work was reported in the October 2000 proceedings of the World Automation
Congress.
"This is a fundamentally new way to communicate with machines -- another
way to talk with our mechanical world," said the paper's principal author,
Dr. Charles Jorgensen, head of the neuroengineering laboratory at NASA's
Ames Research Center, Moffett Field, CA. The other authors are fellow Ames
researchers Dr. Kevin Wheeler and Dr. Slawomir Stepniewski. "This new
technology is significant in that neuroelectric control of computers can
replace computer keyboards, mice and joysticks for some uses," Jorgensen
added.
"In the experiment, a pilot closes his fist in empty air, makes movements
and creates nerve signals that are captured by a dry electrode array on
his arm," said Jorgensen. "The nerve signals are analyzed and then routed
through a computer, allowing the pilot to control the simulated airplane."
The pilot sees the aircraft and control panel projected on a large,
dome-shaped screen while flying the aircraft.
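As a purely illustrative aside, the routing step Jorgensen describes (a decoded gesture goes in, a simulator command comes out) can be sketched in a few lines of Python. The gesture names, command fields, and the send_command() hook below are hypothetical stand-ins; the article does not describe this interface.

# Illustrative sketch only: routing a recognized gesture to a simulator
# control input. Gesture names, command fields, and send_command() are
# hypothetical; the NASA interface is not described in the article.
GESTURE_TO_COMMAND = {
    "fist":        {"axis": "pitch", "delta": -0.05},  # ease the nose down
    "open_hand":   {"axis": "pitch", "delta": +0.05},  # raise the nose
    "wrist_left":  {"axis": "roll",  "delta": -0.05},
    "wrist_right": {"axis": "roll",  "delta": +0.05},
}

def route_gesture(gesture, send_command):
    """Translate a decoded gesture into a small control-surface adjustment."""
    cmd = GESTURE_TO_COMMAND.get(gesture)
    if cmd is not None:
        send_command(cmd["axis"], cmd["delta"])

# Toy usage with a stand-in for the simulator link.
route_gesture("wrist_left", lambda axis, delta: print(axis, delta))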
Engineers made the first prototype armband from exercise tights, and used
metallic dress-buttons as dry electrodes. "An advantage of using
neuroelectric machine control is that human nerve signals can be linked
directly with devices without the aid of joysticks or mice, thereby
providing rapid, intuitive control," Jorgensen added. "This technology
also is useful for astronauts in spacesuits who need to control tools in
space."
Neuroelectric control uses "neural net" software that "learns" patterns
that can slowly change and evolve over time, and that can combine many
patterns to generate a response.
Nerve signal patterns, each of which is potentially as unique as a
fingerprint, are a perfect application for neural net software. A
particular nerve-signal pattern tells muscles to move in a certain way. A
computer can match each unique nerve-signal pattern with a particular
gesture, such as making a fist or pointing. Scientists designed software
that can adjust for each pilot's nerve patterns, which can be affected by
caffeine use, biorhythms, performance stress and the amount of fat under
the skin.
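The calibration and matching described above can be made concrete with a small sketch. What follows is an assumption-laden stand-in rather than the Ames software: the gesture set, the eight-channel feature vectors, and the simple softmax classifier are illustrative choices, and the toy data merely shows the shape of the per-pilot calibrate-then-classify loop.

# Illustrative sketch, not the NASA software: fit per-pilot weights from a
# short calibration session, then map new 8-channel feature vectors to gestures.
import numpy as np

GESTURES = ["fist", "point", "wrist_left", "wrist_right"]  # hypothetical set
N_ELECTRODES = 8                                           # as in the armband

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def calibrate(features, labels, epochs=200, lr=0.1):
    """Fit per-pilot weights from calibration data.

    features: (n_samples, 8) array of rectified, averaged electrode signals
    labels:   (n_samples,) integer gesture indices
    """
    n, d = features.shape
    k = len(GESTURES)
    W = np.zeros((d, k))
    b = np.zeros(k)
    onehot = np.eye(k)[labels]
    for _ in range(epochs):
        p = softmax(features @ W + b)
        W -= lr * features.T @ (p - onehot) / n   # cross-entropy gradient step
        b -= lr * (p - onehot).mean(axis=0)
    return W, b

def classify(sample, W, b):
    """Map one 8-channel feature vector to the most likely gesture."""
    p = softmax(sample.reshape(1, -1) @ W + b)
    return GESTURES[int(p.argmax())]

# Toy usage: synthetic data standing in for real EMG features.
rng = np.random.default_rng(0)
y = rng.integers(0, len(GESTURES), size=200)
X = rng.normal(size=(200, N_ELECTRODES)) + y[:, None] * 0.5
W, b = calibrate(X, y)
print(classify(X[0], W, b))

In this sketch, the per-pilot calibrate() step is the part that would absorb shifts from caffeine, stress, or the amount of fat under the skin: the raw signal changes, the gesture does not, so the weights are refit rather than the gesture set.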
To demonstrate bioelectric muscle control of the simulated 757 airplane
during emergencies, researchers combined this technology with two other
NASA developments: the ability of the neural net software to learn to fly
damaged airplanes, and propulsion-only landing of aircraft.
In about one-sixth of a second, a computer onboard a damaged aircraft can
"relearn" to fly a plane, giving the pilot better control. Severe damage,
such as partially destroyed wings, fuselage holes or sensor failures,
greatly alters how an airplane handles, and a pilot's controls may respond
oddly or might not work at all, according to Jorgensen.
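To give a feel for what fast "relearning" of a damaged aircraft's behavior can look like, here is a minimal least-squares sketch under stated assumptions: the linear model, its dimensions, and the solver are invented for exposition, and the article does not say how NASA's neural flight-control software actually works.

# Illustrative sketch only: re-estimate a simple linear model of how the
# damaged aircraft responds, from a short window of recent flight data.
import numpy as np

def relearn_dynamics(states, inputs, next_states):
    """Re-fit x_{t+1} ~= A x_t + B u_t by least squares."""
    Z = np.hstack([states, inputs])                 # (n_samples, n_x + n_u)
    theta, *_ = np.linalg.lstsq(Z, next_states, rcond=None)
    n_x = states.shape[1]
    return theta[:n_x].T, theta[n_x:].T             # estimated A, B

# Toy usage: simulate a "damaged" system and recover its dynamics from data.
rng = np.random.default_rng(1)
A_true = np.array([[0.95, 0.10], [0.00, 0.80]])     # post-damage dynamics
B_true = np.array([[0.00], [0.05]])                 # weakened control surface
x, u, x_next = np.zeros((50, 2)), rng.normal(size=(50, 1)), np.zeros((50, 2))
state = np.zeros(2)
for t in range(50):
    x[t] = state
    state = A_true @ state + B_true @ u[t] + rng.normal(scale=1e-3, size=2)
    x_next[t] = state
A_hat, B_hat = relearn_dynamics(x, u, x_next)
print(np.round(A_hat, 2), np.round(B_hat, 2))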
"When we combined the three technologies, the neuroelectrically wired
pilot took the simulated aircraft into landing scenarios with a cascading series
of accidents, first locking rudder controls and then progressing to full
hydraulic failure," said Jorgensen. "For each case, successful landings
were demonstrated for autopilot, damaged and propulsion-only control."
Now in 2011
ELECTROENCEPHALOGRAPHIC (EEG)-BASED CONTROL
Scientists say that electrodes integrated into the pilot’s headgear, positioned over
specific areas of the scalp, can provide the necessary signals to implement
EEG-based control.
This type of control translates the electrical activity of the brain into a
control signal. In one approach, EEG patterns are brought under conscious
voluntary control with training and biofeedback.
This approach is not appropriate at this time for the agile aircraft pilot because
of the significant training investment. A more applicable approach harnesses
naturally occurring brain rhythms, patterns, and responses that correspond to
human sensory processing, cognitive activity or motor control.
The most plausible method to date uses brain responses to modulated stimuli.
These brain responses include components that modulate at the same frequency
as the evoking stimuli.
Thus, if selectable items of a display are modulated at different frequencies, the
pilot’s choice between selectable items can be identified by detecting which frequency
pattern is dominant in the visual evoked brain activity.
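A minimal sketch of that detection step, written as an assumption rather than the study's method, might tag each selectable item with its own frequency and pick whichever carries the most power in the evoked signal. The sampling rate, item names, and tag frequencies below are invented for illustration.

# Illustrative sketch: pick the display item whose tagging frequency
# dominates the power spectrum of a single EEG channel.
import numpy as np

FS = 256.0                                                   # assumed sampling rate, Hz
ITEM_FREQS = {"flaps": 8.0, "gear": 10.0, "throttle": 12.0}  # hypothetical display tags

def dominant_item(eeg_window):
    """Return the display item whose tagging frequency carries the most power."""
    n = len(eeg_window)
    spectrum = np.abs(np.fft.rfft(eeg_window * np.hanning(n))) ** 2
    freqs = np.fft.rfftfreq(n, d=1.0 / FS)
    powers = {item: spectrum[np.argmin(np.abs(freqs - f))]   # nearest FFT bin
              for item, f in ITEM_FREQS.items()}
    return max(powers, key=powers.get)

# Toy usage: two seconds of synthetic EEG dominated by the 10 Hz response.
t = np.arange(0, 2.0, 1.0 / FS)
eeg = 2.0 * np.sin(2 * np.pi * 10.0 * t) + np.random.default_rng(2).normal(size=t.size)
print(dominant_item(eeg))                                    # expected: gear

In a sketch like this, the window length trades decision speed against frequency resolution, which is one reason signal-processing time is singled out below as something to minimize.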
The pilot gazes on the desired selection and the controller registers the corresponding
frequency of the displayed item. In effect, through EEG-based control, the advantages
of eye gaze-based control can be realized without expensive and obtrusive hardware
components. Optimization of this head-up control requires minimizing the time required
for signal processing, developing easily donned electrodes, and minimizing the distraction
produced by modulating (flashing) display items. It is this last factor that is key to agile
aircraft operation.
One potential solution is to modulate display items at a sufficiently high frequency such
that the pilot does not perceive the flashing.
Research is underway to investigate whether the brain responses produced by high-frequency
modulated stimuli are adequate for implementing EEG-based control.
The Future is now.