Computer science faces decision theory

By Eric Smalley, Technology Research News
July 26, 2000

Not only are computers and networks getting ever more complicated, but the amount of uncertainty they must contend with is also increasing. A Cornell University computer scientist is proposing that researchers take a page from philosophy and adopt more sophisticated decision theory techniques in order to get a handle on this rise in system complexity.

Decision theory determines the probabilities and utility, or resulting value, of potential outcomes, and uses that information to figure out the best solution for a given problem. Algorithms that use decision theory calculate probabilities and utility for different settings of a system to find the best one.

For example, a system administrator could use decision theory tools to calculate the probabilities of a system crashing at various settings and then weigh those against the utility of the settings, as measured by end-user satisfaction, to derive the optimal setting.
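
A back-of-the-envelope version of that calculation might look like the Python sketch below. The candidate settings, crash probabilities and satisfaction scores are all invented for illustration; in practice they would come from measurement:

    # Toy expected-utility calculation for choosing a system setting.
    # All settings, crash probabilities, and satisfaction scores here
    # are hypothetical values chosen only to illustrate the idea.

    settings = {
        # name: (probability of a crash, satisfaction score if it runs)
        "conservative": (0.01, 60.0),
        "balanced":     (0.05, 85.0),
        "aggressive":   (0.20, 95.0),
    }

    CRASH_UTILITY = 0.0  # assume a crash delivers no value to the user

    def expected_utility(p_crash, satisfaction):
        # Weight each outcome's utility by its probability.
        return p_crash * CRASH_UTILITY + (1.0 - p_crash) * satisfaction

    for name, (p, u) in settings.items():
        print(f"{name}: expected utility = {expected_utility(p, u):.2f}")
    print("best:", max(settings, key=lambda s: expected_utility(*settings[s])))

Here the "balanced" setting wins even though "aggressive" offers the highest satisfaction, because its crash risk drags down the average value it delivers.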

"The networking community actually has been really good about taking things like utility seriously," said Joseph Y. Halpern, a computer science professor at Cornell. "Most of the rest of computer science hasn't, partly because they haven't had the tools. Standard utility theory assumes you [know] all the probabilities. Typically [researchers and developers] have some information about probabilities but not complete information, so they can't exactly use the off-the-shelf tools" to incorporate utility.

Algorithms that take a more detailed view of utility could improve the performance of complicated systems, especially if they measure utility over time, said Halpern. His research shows that database query optimization, in particular, could be improved by a better understanding of utility.

For example, database queries often involve multiple variables, and database management systems have to look up each variable in a separate table. Combining data from multiple tables is called joining. "It turns out that the order you join the tables in matters a whole lot," Halpern said. "It doesn't matter for the final answer, but it matters in terms of how long it will take you to get that answer. And the differences can be humongous."
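
A toy cost model makes the point. In the Python sketch below, the table sizes and join selectivities are invented, but they show how one join order can materialize a hundred times more intermediate data than another while producing the same final answer:

    # Toy comparison of two join orders over the same three tables.
    # Table sizes and join selectivities are hypothetical; a real
    # optimizer would estimate them from database statistics.

    sizes = {"A": 1_000_000, "B": 1_000_000, "C": 1_000}
    selectivity = {frozenset("AB"): 1e-4,
                   frozenset("AC"): 1e-3,
                   frozenset("BC"): 1e-3}

    def intermediate_rows(first, second):
        # Rows produced by joining the first two tables before the third
        # is brought in -- a crude proxy for the work the plan does.
        return sizes[first] * sizes[second] * selectivity[frozenset(first + second)]

    print(f"join A-B first: {intermediate_rows('A', 'B'):,.0f} intermediate rows")
    print(f"join A-C first: {intermediate_rows('A', 'C'):,.0f} intermediate rows")

Joining the two large tables first produces 100 million intermediate rows; starting with the small table produces only one million, even though both plans end at the same result.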

Researchers and developers working on database query optimization usually calculate utility only for a typical instance of the system as measured by parameters like available memory, said Halpern. However, looking at the typical situation might not give the answer that works best overall, he said.

"You might have an algorithm that works really well if you've got lots of memory and degrades very badly if you've got very little memory," he said. "You've got another algorithm that works pretty well if you've got lots of memory but doesn't degrade as badly. Now suppose you have a situation where 80 percent of the time you have lots of memory and 20 percent of the time you have just a little bit of memory. If you take plan A, 20 percent of the time it gets hammered. If you take plan B, it doesn't do quite as well 80 percent the time but those 20 percent of the time it does a whole lot better. On average it might do a whole lot better than plan A."

Though decision theory in general is not new to computing, computer scientists are just beginning to apply more sophisticated decision theory techniques, said Michael L. Littman, a principal technical staff member at AT&T Labs-Research. "One recent trend has been to use more and more sophisticated notions of utility and uncertainty and to build systems that reason about utility directly," he said.

One problem with employing more sophisticated decision theory techniques is that measuring utility can be difficult, noted Halpern. And there's only so far you can go. For example, it's virtually impossible to measure utility for end-users of a system because the system's utility for one person is not necessarily the same as for another, he said. Still, researchers and developers can do more to incorporate utility than they have to date, he said.

According to Littman, that is beginning to happen.

"As always, there is a tension between how tricky a problem is and how much time it'll take the computer to solve it," he said. "As computers get faster and algorithms get better, we're seeing a very natural progression towards considering more sophisticated models."

Halpern's most recent paper on the subject, "A Decision-Theoretic Approach to Resource Allocation in Wireless Multimedia Networks," was published in the proceedings of the Dial M for Mobility 2000 conference. The paper was written with Zygmunt Haas, Li Li and Stephen B. Wicker, all of Cornell. Halpern's work on applying decision theory to wireless networks was funded by the National Science Foundation (NSF).

Timeline:   Not applicable
Funding:   Government
TRN Categories:   Data Structures and Algorithms; Series
Story Type:   News
Related Elements:   Three technical papers: "A Decision-Theoretic Approach to Resource Allocation in Wireless Multimedia Networks," "Least Expected Cost Query Optimization: an Exercise in Utility," and "A Decision-Theoretic Approach to Reliable Message Delivery" posted on the Computing Research Repository



