Light pipes track motion
Technology Research News
Researchers at Duke University have devised
a simple tracking method that promises to dramatically reduce the computing
resources needed for computer vision systems that allow computers and
robots to sense their surroundings.
The technique bridges the gap between full-blown computer vision
systems, which precisely track moving objects but are computer-intensive,
and simple, inexpensive motion detectors, which are much less precise.
Traditional computer vision systems use relatively sophisticated
software and camera equipment; they are also limited to fairly simple
models of the relationship between physical space and the camera, said
David Brady, a professor of electrical and computer engineering at Duke.
The researchers' method dispenses with the complicated software
and lenses and instead maps the angles of light radiating from a source
by channeling the light through a set of pipes onto a set of light detectors.
As an object moves across the field of view, light reflecting from the
object triggers some detectors but not others.
The method relies on a rapid prototyping system, which uses computer-controlled
lasers to harden liquid plastic or fuse powdered metal, to make a custom
set of pipes. The researchers calculate the necessary pipe angles for
a certain task and use the rapid prototyping system to produce the structure.
The researchers made a prototype that monitored a moving light
source at a distance of three meters. The 25.2-millimeter prototype has
eight viewing angles, eight detectors and 36 pipes. Each pipe channels
light from a given angle to a detector. Seven of the eight detectors monitor
four angles and the remaining one monitors all eight. Each of the eight
viewing angles spans five degrees, giving the device a 40-degree field of view.
When an object is in one position within the field of view, for
example, it triggers detectors five, six, seven and eight, and when it
moves to the next position it triggers detectors three, four, six and
eight. A computer controlling the device simply has to know which combination
of triggered detectors corresponds to which position.
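The lookup described above amounts to a small table from triggered-detector sets to positions. The sketch below is illustrative only: the article gives just two of the eight codes, so the remaining rows are invented, and detector 8 is assumed to be the one that monitors all eight angles. The table is consistent with the article's numbers: detectors 1 through 7 each appear under four angles and detector 8 under all eight, for 36 pipes in total.

```python
# Hypothetical angle -> detector mapping. The codes for angles 1 and 2
# come from the article; the rest are made up for illustration.
ANGLE_TO_DETECTORS = {
    1: {5, 6, 7, 8},
    2: {3, 4, 6, 8},
    3: {1, 2, 5, 8},
    4: {1, 3, 7, 8},
    5: {2, 4, 5, 8},
    6: {1, 4, 6, 8},
    7: {2, 3, 7, 8},
    8: {1, 2, 3, 4, 5, 6, 7, 8},
}

# Invert the table: a set of triggered detectors identifies a position.
SIGNATURE_TO_ANGLE = {frozenset(d): a for a, d in ANGLE_TO_DETECTORS.items()}

def position(triggered):
    """Return the viewing angle (1-8) for a set of triggered detectors."""
    return SIGNATURE_TO_ANGLE.get(frozenset(triggered))

print(position({5, 6, 7, 8}))  # -> 1
```

Because each angle produces a distinct detector signature, the controlling computer never processes pixels; it just matches one of eight small sets.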
In contrast, computer vision systems analyze every pixel in each
digital video frame -- usually 15 to 30 frames per second -- to determine
the borders of objects in the scene. The software tracks motion by comparing
the position of an object's pixels, relative to the background pixels, from
one frame to the next.
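A conventional frame-differencing tracker of the kind described above can be sketched as follows. This is a generic illustration, not the Duke system; the frames are plain nested lists of grayscale values, and the threshold is arbitrary.

```python
def moving_pixels(prev_frame, curr_frame, threshold=30):
    """Compare two grayscale frames pixel by pixel and return the
    (x, y) coordinates whose intensity changed by more than the
    threshold. Every pixel of every frame must be examined, which is
    why this approach is expensive at 15 to 30 frames per second."""
    moved = []
    for y, (prev_row, curr_row) in enumerate(zip(prev_frame, curr_frame)):
        for x, (p, c) in enumerate(zip(prev_row, curr_row)):
            if abs(c - p) > threshold:
                moved.append((x, y))
    return moved

# A bright 'object' shifts one pixel to the right between frames.
frame_a = [[0, 200, 0, 0],
           [0, 200, 0, 0]]
frame_b = [[0, 0, 200, 0],
           [0, 0, 200, 0]]
print(moving_pixels(frame_a, frame_b))  # -> [(1, 0), (2, 0), (1, 1), (2, 1)]
```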
At the other end of the scale, motion detectors like those that
turn on backyard lights simply detect motion and don't track the positions
of objects. They detect rapid changes in the intensity of infrared light
hitting first one and then the other of a pair of side-by-side light detectors.
The infrared light is typically produced by the heat of a human body,
and the sequential triggering of the detectors is typically caused by
a person moving across the motion detector's field of view.
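The two-detector scheme described above can be sketched in a few lines; the sample values and threshold are invented for illustration.

```python
def detects_motion(left_signal, right_signal, threshold=10):
    """Report motion when a rapid intensity change hits one detector
    and then, at a later sample, the other -- the sequential triggering
    a warm body produces as it crosses a pair of side-by-side detectors."""
    def spike_times(signal):
        return [i for i in range(1, len(signal))
                if abs(signal[i] - signal[i - 1]) > threshold]
    left, right = spike_times(left_signal), spike_times(right_signal)
    # Motion means both detectors spiked, one after the other.
    return bool(left and right and left[0] != right[0])

# A warm body crosses left to right: the left detector spikes first.
left = [50, 50, 80, 50, 50, 50]
right = [50, 50, 50, 50, 80, 50]
print(detects_motion(left, right))  # -> True
```

Note that nothing here records *where* the object is, only that something moved, which is exactly the limitation the article describes.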
The separate angles of the field of view through the researchers'
structure allow for basic digital representations of moving objects, and
the relatively low-tech detector array cuts down the amount of information
a computer must sift through, according to Brady. "These sensors may be
capable of reducing the data load in tracking... systems by several orders
of magnitude," he said.
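Brady's "several orders of magnitude" figure is easy to sanity-check with rough, assumed numbers: a modest camera feed versus an eight-detector array sampled at the same rate. None of the figures below come from the paper.

```python
# Assumed figures for a back-of-the-envelope comparison.
width, height = 640, 480          # pixels per frame
frames_per_second = 30
bits_per_pixel = 8                # 8-bit grayscale

camera_bits_per_second = width * height * bits_per_pixel * frames_per_second

detectors = 8                     # one bit each: triggered or not
samples_per_second = 30           # sample the array at frame rate

sensor_bits_per_second = detectors * samples_per_second

ratio = camera_bits_per_second / sensor_bits_per_second
print(f"camera: {camera_bits_per_second:,} b/s, "
      f"sensor: {sensor_bits_per_second} b/s, "
      f"ratio: {ratio:,.0f}x")
```

Under these assumptions the ratio works out to roughly 300,000 to one, about five orders of magnitude, which is consistent with the quoted claim.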
The lightened computational load could make object tracking much
cheaper. The researchers are working on using the method to track vehicles
and people in real time, and have produced a prototype that tracks cars
at a distance of 15 meters. "The sensors may also be useful in developing
spatially-aware robots," said Brady.
The researchers are working with commercial partners to develop
simple motion tracking systems using the technology, according to Brady.
The system should be ready for practical use in the next year, he said.
Brady's research colleagues were Prasant Potuluri, Unnikrishnan
Gopinathan and James R. Adelman. The work appeared in the April 21, 2003
issue of Optics Express. The research was funded by the Defense
Advanced Research Projects Agency (DARPA).
Timeline: 1 year
TRN Categories: Computer Vision and Image Processing
Story Type: News
Related Elements: Technical paper, "Lensless Sensor System
Using a Reference Structure," April 21, 2003, Optics Express.
July 2/9, 2003