Five percent of nodes keep Net together
By Kimberly Patch, Technology Research News
Because the Internet is a distributed network
with no central server directing information flow, there are many potential
paths from any given point on the network to any other point. This makes
it a robust network that is difficult to shut down.
The Internet is
also a scale-free, or power-law, network,
meaning it harbors a small number of very large hubs with many connections
to other nodes, and a large number of nodes with only a few connections.
This concentration of connections, a trait the Internet shares with large
social and biological networks, makes it more vulnerable to intentional
attack than a network whose connections are more evenly distributed.
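To see what a power-law degree distribution looks like, the following Python sketch generates a network with the Barabási-Albert model, a standard stand-in for scale-free topologies (an assumption made for illustration; the real Internet's structure is known only approximately), and counts how many nodes have each number of connections:

# Illustrative sketch only: the Barabasi-Albert model is a common
# stand-in for scale-free networks, not the Internet's exact topology.
import networkx as nx
from collections import Counter

g = nx.barabasi_albert_graph(n=10_000, m=2, seed=1)
counts = Counter(k for _, k in g.degree())
for k in sorted(counts)[:8]:
    print(f"{counts[k]:5d} nodes with {k} connections")
# Most nodes have only a few connections; a handful of hubs have hundreds.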
Researchers from Bar-Ilan University in Israel and Clarkson University
are examining just how vulnerable the Internet's scale-free nature makes
it. Knowing more about scale-free networks' vulnerabilities may point
the way to both protecting the Internet from attacks and providing better
strategies for attacking biological networks in order to fight disease.
While the Internet is made up of computers connected to one another by
communications lines, a typical biological scale-free network is made
up of the molecules a cell uses. In this case, the network connections
are interactions among molecules. The large hubs in a cell's chemical
communications network include water and the cellular fuel ATP, which
take part in many more reactions than most other molecules in the cell.
The researchers' work shows that large scale-free networks are fairly impervious
to random node breakdowns, but if large hubs are targeted methodically,
even large scale-free networks can be broken up into separate islands.
"We've studied the problem mathematically. According to our findings,
while networks like the Internet are resilient to random breakdown of
nodes, they're very sensitive to intentional attack on the highest connectivity
nodes," said Shlomo Havlin, a physics professor at Bar-Ilan University.
This is because a scale-free network's stability depends on the state
of its large hubs, he said.
In scale-free networks as large as the Internet, "there are just enough
high connectivity nodes to keep the network connected under any number
of randomly broken nodes," he said. "A random breakdown of nodes will
leave some... highly connected sites intact, and they will keep a large
portion of the network connected."
An attack that targets about five percent of these highly connected sites,
however, has the capacity to totally collapse the Internet, "very rapidly
[breaking] down the entire network to small, unconnected islands," containing
no more than 100 computers each, Havlin said.
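A rough simulation makes the contrast between the two failure modes concrete. This Python sketch (again using the Barabási-Albert model as an assumed stand-in for the Internet) removes five percent of nodes once at random and once by targeting the best-connected nodes, then measures how much of the network remains in one piece:

# Hedged sketch: compare random breakdown with an attack on the
# highest-connectivity nodes. Exact numbers depend on the model;
# the qualitative gap between the two cases is the point.
import random
import networkx as nx

n = 10_000
base = nx.barabasi_albert_graph(n, 2, seed=1)

def giant_fraction(g):
    # Size of the largest connected piece, as a fraction of survivors.
    return max(len(c) for c in nx.connected_components(g)) / g.number_of_nodes()

rnd = base.copy()
rnd.remove_nodes_from(random.sample(list(rnd.nodes()), int(0.05 * n)))
print("random breakdown:", round(giant_fraction(rnd), 2))   # stays near 1.0

atk = base.copy()
hubs = sorted(atk.degree(), key=lambda kv: kv[1], reverse=True)
atk.remove_nodes_from(node for node, _ in hubs[: int(0.05 * n)])
print("targeted attack: ", round(giant_fraction(atk), 2))   # substantially smaller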
The researchers cannot pinpoint the breakdown threshold any more precisely
than near five percent, Havlin noted, because the exact distribution of
nodes on the Internet can only be roughly estimated.
To find the threshold, the researchers used a branch of mathematics known
as percolation theory, which was originally developed to predict how much
oil can be pumped from a reservoir. "Since oil can only flow through holes
in the ground, this is similar to data flowing through... computers on
the Internet," said Havlin.
Another way to picture percolation theory is to draw a square lattice
of dots on a piece of paper. If you remove a small number of the dots,
you can still connect the rest of the dots around the ones you have removed.
"However, after removing the critical fraction [of dots] there's no continuous
paths from side to side," said Havlin.
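The lattice picture translates directly into a small program. The sketch below (illustrative, not from the paper) knocks out a random fraction of sites on a square grid and checks whether a connected path still runs from the left edge to the right edge; for site percolation on this lattice the critical fraction removed is roughly 40 percent:

# Hedged sketch of the lattice picture: delete sites at random and
# test for a left-to-right crossing path via flood fill.
import random

def crosses(size, fraction_removed, seed=0):
    rng = random.Random(seed)
    alive = [[rng.random() >= fraction_removed for _ in range(size)]
             for _ in range(size)]
    # Flood-fill from every surviving site on the left edge.
    stack = [(r, 0) for r in range(size) if alive[r][0]]
    seen = set(stack)
    while stack:
        r, c = stack.pop()
        if c == size - 1:
            return True   # reached the right edge
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < size and 0 <= nc < size \
                    and alive[nr][nc] and (nr, nc) not in seen:
                seen.add((nr, nc))
                stack.append((nr, nc))
    return False

print(crosses(100, 0.30))  # well above the threshold: usually True
print(crosses(100, 0.50))  # past the critical fraction: usually False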
In terms of the Internet, "as long as we're above the threshold, there
will be a large connected structure with size proportional to that of
the entire Internet. Below the threshold, there will only be small unconnected
islands of sizes in the dozens [of nodes] each," he said.
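The same simulation idea can locate the threshold: step up the attacked fraction and watch for the point where the largest connected piece stops scaling with the network (again an illustrative sketch on a model network, not the researchers' mathematical calculation):

# Hedged sketch: scan attack sizes to watch the giant component
# collapse. The precise threshold depends on the network model.
import networkx as nx

base = nx.barabasi_albert_graph(10_000, 2, seed=1)
ranked = [node for node, _ in sorted(base.degree(), key=lambda kv: -kv[1])]
for pct in (0.01, 0.03, 0.05, 0.07):
    g = base.copy()
    g.remove_nodes_from(ranked[: int(pct * 10_000)])
    frac = max(len(c) for c in nx.connected_components(g)) / g.number_of_nodes()
    print(f"attack {pct:.0%}: largest piece holds {frac:.2f} of survivors")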
The researchers' work offers the theoretical basis for calculating the
threshold for the breakdown of any complicated network, said Albert-László
Barabási, a physics professor at the University of Notre Dame. "By offering
a method to calculate... the number of nodes required to be removed in
order to destroy the network by breaking it into isolated clusters, it
will be of great use [in] fields ranging from Internet research to drug
delivery, where the goal is, [for example,] to destroy some microbes by
gene removal. I expect this result will have a lasting impact on our understanding
of the resilience of complex networks in general," he said.
The researchers' aim is to find ways to design networks that are more
resilient to both random error and intentional breakdown, said Havlin.
The work may also lead to better understanding of network traffic and
virus propagation on the Internet, he said.
Havlin's research colleagues were Reuven Cohen and Keren Erez of Bar-Ilan
University in Israel, and Daniel ben-Avraham of Clarkson University. They
published the research in the April 16, 2001 issue of Physical Review
Letters. The work was funded by Bar-Ilan University and the Minerva
Center.
Timeline: Now
Funding: Institutional, University
TRN Categories: Networking
Story Type: News
Related Elements: Technical paper, "Breakdown of the Internet
under Intentional Attack," Physical Review Letters, April 16, 2001.