Software eases remote robot control

By Chhavi Sachdev, Technology Research News
August 22/29, 2001

Remember when every kid had a remote control car, and sometimes parents did, too? Running around the house chasing a tiny car and jamming the joystick controls was a part of growing up. It seems that technology has grown up as well.

A team of researchers at Carnegie Mellon University and the Swiss Federal Institute of Technology at Lausanne has developed software that allows people to control the movement of a robot using a computer or personal digital assistant (PDA) and the Internet. The setup allows for remote control of a robot from anywhere in the world.

It’s not just a game, though. “Our work is inspired by a wide range of remote vehicle applications, particularly military reconnaissance and planetary exploration,” said Terrence Fong, a research assistant at the Robotics Institute at Carnegie Mellon University. The tools enable a user to understand the remote environment and control the remote vehicle better, Fong said.

Traditionally, even experts have found it difficult to drive robots remotely. The teleoperation tools make driving mobile robots easier because the user and the remote vehicle share control, Fong said. “Our work is centered on a concept called collaborative control, in which the human and the robot engage in dialogue,” he said.

Operating remote vehicles using these techniques requires no special training, he said.

The human and the remote robot exchange questions and information, so the robot can decide how to use human advice, following it when it is forthcoming and relevant, and modifying it when the advice is unsafe or inappropriate, he said.

They aren’t talking about a sentient HAL-esque being; the robot still follows a higher-level strategy set by the human. Still, the robot has more freedom in execution, which makes it able to function even when the human operator is unavailable or distracted, according to Fong. This makes the system more dependable, he said.
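In broad strokes, that advice-taking logic can be sketched in a few lines of code. The sketch below is only an illustration of the idea, with made-up names and a hypothetical robot interface rather than the researchers' actual software:

# Illustrative sketch of collaborative control: the robot weighs operator
# advice against its own checks and can ask the operator for help.
# The Robot interface and method names here are assumptions, not the
# researchers' actual API.

class CollaborativeController:
    def __init__(self, robot):
        self.robot = robot

    def handle_advice(self, advice):
        """Follow operator advice when it is safe; otherwise adjust or ask."""
        if advice is None:
            # Operator unavailable or distracted: fall back to the current plan.
            return self.robot.continue_current_plan()
        if self.robot.is_safe(advice):
            return self.robot.execute(advice)
        # Advice looks unsafe or inappropriate: modify it and report the change.
        safer = self.robot.closest_safe_command(advice)
        self.robot.notify_operator(f"Modified command: {advice} -> {safer}")
        return self.robot.execute(safer)

    def on_stuck(self):
        # Dialogue in the other direction: the robot asks a question.
        return self.robot.ask_operator("I seem to be stuck, can you help?")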

The PC version of the teleoperation system is dubbed WebDriver and the PDA version, PdaDriver. Both versions are designed to minimize network bandwidth usage. The systems function “even with low-bandwidth, high-delay networks,” said Fong. Both interfaces combine several types of sensory input to give the human operator a composite picture of the robot’s environment.

The system’s input devices, which include a laser scanner, monochrome video camera, stereovision sensor, ultrasonic sonar, and an odometer, complement each other. For example, if the robot is standing directly in front of a smooth, untextured wall with a large plant close by, each sensor will miss something from the scene. With the sonar detecting the plant, the laser scanner following the wall, and stereo vision finding the wall's edge, the sensors take in the whole scene.
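The idea of fusing complementary range sensors can be illustrated with a toy example: keep, for each direction, the closest obstacle any sensor reports. The data and names below are assumptions for illustration, not the actual system.

# Toy sensor-fusion sketch: merge range readings from complementary sensors
# into one picture of nearby obstacles. Each sensor reports
# {bearing_degrees: range_meters}; a missing bearing means that sensor saw
# nothing there. The fused map keeps the closest reading per bearing.

def fuse_ranges(*sensor_maps):
    fused = {}
    for readings in sensor_maps:
        for bearing, dist in readings.items():
            if bearing not in fused or dist < fused[bearing]:
                fused[bearing] = dist
    return fused

# The plant shows up on sonar, the wall on the laser scanner, and the
# wall's edge in stereo vision; together they cover the whole scene.
sonar  = {30: 0.8}             # plant detected by sonar
laser  = {0: 2.0, 10: 2.1}     # smooth wall tracked by the laser scanner
stereo = {-20: 2.5}            # wall edge found by stereo vision
print(fuse_ranges(sonar, laser, stereo))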

The Web version is a Java applet that allows the user to see the status of all five sensors and give the robot specific commands in several different ways. Instead of live video, the image server sends images only when something significant happens, such as when an obstacle appears, said Fong.
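The bandwidth-saving trick amounts to pushing a frame only when the scene changes enough to matter. A rough sketch, with assumed function names and thresholds:

# Sketch of event-driven image updates: instead of streaming video, push a
# frame only when something significant changes. The camera object,
# send_frame callback and changed_significantly test are assumptions.

import time

def image_server(camera, send_frame, changed_significantly, poll_s=0.5):
    last_frame = None
    while True:
        frame = camera.grab()
        if last_frame is None or changed_significantly(last_frame, frame):
            send_frame(frame)      # push the new frame to the operator's applet
            last_frame = frame
        time.sleep(poll_s)         # low polling rate keeps bandwidth use down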

The Web interface has two primary modes: a dynamic map, which shows radar and sonar inputs, and an image manager, which displays and stores data from the pan/tilt camera mounted on the front of the robot. Both modes allow a user to send commands, receive feedback and control the robot’s camera. In image manager mode, the user drives the robot by clicking a series of waypoints and then pressing Go.

A user can also control the robot manually by telling it to, for instance, move forward 5 meters or turn right at 10 degrees per second. The user can do this in situations in which waypoint driving does not work, said Fong. The Web application also supports touchscreen controls, which could allow people to remotely control robots from devices in kiosks, Fong said.
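The two command styles, waypoint driving and direct motion commands, might be sketched like this, assuming a hypothetical robot interface rather than the actual WebDriver code:

# Toy sketch of the two command styles described above: waypoint driving
# (click points, press Go) and direct commands ("forward 5 m", "turn right
# at 10 deg/s"). The Robot methods are hypothetical.

from dataclasses import dataclass

@dataclass
class Waypoint:
    x: float   # meters, in the robot's map frame
    y: float

def drive_waypoints(robot, waypoints):
    """Visit the clicked waypoints in order, letting the robot plan each leg."""
    for wp in waypoints:
        robot.go_to(wp.x, wp.y)

def drive_manual(robot):
    """Direct commands for situations where waypoint driving does not work."""
    robot.translate(5.0)        # move forward 5 meters
    robot.rotate(rate=-10.0)    # turn right at 10 degrees per second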

The PDA version has four control modes: command, sensors, video, and map. In command mode, a user controls the robot’s relative position and motion by clicking on the display’s vertical or horizontal axis. In sensors mode, the user can directly control the robot’s on-board sensors to pan, tilt, and zoom the robot’s camera, enable and disable sonars, and activate motion detection triggers, according to Fong.

Video mode displays images from a robot-mounted camera and map mode displays a sonar map from both robot and global frames of reference. The video and map modes also allow a user to control the robot's movement using waypoint clicking.

PdaDriver is an improved version of WebDriver, said Fong. “PdaDriver allows the user to specify a path... PdaDriver also supports collaborative control, so that the robot can ask questions of the human, [such as,] ‘I seem to be stuck, can you help?’” said Fong.

The researchers are also working on a remote driving system, GestureDriver, which can be used without keyboard or joystick-based interfaces, said Fong. Putting the vision system on the robot allows a user to have a direct visual interaction with the robot, controlling it by hand gestures such as pointing an arm to where the robot should go, he said.

“Hand motions are converted to remote driving commands,” said Fong. A computer vision system tracks the gestures and classifies them using a geometric model that maps the gestures to specific motion commands for the robot, according to Fong.
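A minimal version of that mapping, with illustrative gains and thresholds rather than GestureDriver's actual geometric model, might look like this:

# Rough sketch of gesture-based driving: a vision tracker reports the hand's
# offset from a neutral point, and a simple geometric mapping turns that
# offset into translation and rotation rates. All names and gains here are
# assumptions for illustration.

def gesture_to_command(hand_x, hand_y, neutral=(0.0, 0.0),
                       speed_gain=0.5, turn_gain=20.0, dead_zone=0.05):
    """Map a tracked hand offset (meters) to (speed m/s, turn deg/s)."""
    dx = hand_x - neutral[0]    # sideways offset -> turning
    dy = hand_y - neutral[1]    # forward offset  -> speed
    if abs(dx) < dead_zone and abs(dy) < dead_zone:
        return 0.0, 0.0         # hand near neutral: stop
    return dy * speed_gain, dx * turn_gain

# Pointing the arm forward and slightly to the side drives forward while turning.
print(gesture_to_command(0.10, 0.40))   # -> (0.2, 2.0)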

The researchers concede, however, that visual gestures are not the easiest way to command the robots. Testers found the method fatiguing, according to Fong.

They have also been working on a “drive-by-feel interface,” called HapticDriver, which is hand controlled, said Fong. In this system, a “haptic device and robot sensors allow a user to feel the environment, thus avoiding collisions and enabling precision driving” and movements such as docking, he said.

The systems could eventually be used by geologists and astronomers to explore and retrieve samples from remote locations on Earth and other planets, Fong said.

“The sensor fusion part is the most sophisticated and interesting” piece of the research, said Paul Backes, technical group leader at NASA’s Jet Propulsion Laboratory. “The paper is a worthwhile collection of the concepts … each of the concepts they discuss seem realistic and valid for some applications,” he added.

Fong's colleagues were Sébastien Grange and Charles Baur of the Swiss Federal Institute of Technology at Lausanne, and Charles Thorpe at Carnegie Mellon University. They published the research in the July 2001 issue of Autonomous Robots. The research was funded by the Defense Advanced Research Projects Agency (DARPA), the National Science Foundation (NSF), and Science Applications International Corporation (SAIC).

Timeline:  2-3 years
Funding:  Corporate; Government
TRN Categories:  Robotics; Human-Computer Interaction
Story Type:  News
Related Elements:  Technical paper, "Advanced Interfaces for Vehicle Teleoperation: Collaborative Control, Sensor Fusion Displays, And Remote Driving Tools," Autonomous Robots, July 2001.
http://imtsg7.epfl.ch/projects/ati



