North American Network Operators Group


Calling all researchers (was Re: Vulnerabilities of Interconnection)

  • From: Sean Donelan
  • Date: Sun Sep 15 01:21:18 2002

On Sat, 14 Sep 2002 [email protected] wrote:
> The point being from a structural standpoint there is convincing
> empirical evidence that the Internet is not a distributed network.

Due to policy constraints, the commercial Internet more closely mimics
a collection of decentralized networks.  However, things are more
complicated because IP permits networks of any size to interconnect at
almost any level over almost any communications technology. We normally
consider it an error when external traffic transits an intranet or
extranet.  But during a disaster, capacity is capacity.

> a small minority has the vast majority of connections.  The applied
> side of this, which I think I've posted about before, so please pardon
> the redundancy, is that the Internet at the AS and router level is
> very resilient to random failures but highly susceptible to targeted
> failures.  This has become so predominant in the literature that it
> has left many folks asking questions along the lines of security.
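
That random-vs-targeted result is easy to reproduce in the small.  A
minimal sketch, assuming the networkx library and a Barabasi-Albert
graph as a crude stand-in for the AS graph (an illustration, not real
topology data):

import random
import networkx as nx

# Power-law-ish graph: a few big hubs, many low-degree nodes.
g = nx.barabasi_albert_graph(n=10000, m=2, seed=42)
k = 100  # knock out 1% of the nodes

def giant_fraction(graph, victims):
    """Fraction of surviving nodes left in the largest component."""
    h = graph.copy()
    h.remove_nodes_from(victims)
    largest = max(nx.connected_components(h), key=len)
    return len(largest) / h.number_of_nodes()

random_victims = random.sample(list(g.nodes), k)
# Targeted attack: remove the highest-degree nodes first.
targeted_victims = sorted(g.nodes, key=g.degree, reverse=True)[:k]

print("random  :", giant_fraction(g, random_victims))
print("targeted:", giant_fraction(g, targeted_victims))

Random removal barely dents the giant component; removing the top hubs
degrades it far more.  That asymmetry is the entire result.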

This has come up on this list and others before.  I've previously
commented on some of the published papers.  I'll try not to repeat
everything.  The missing piece from most of the previous papers I've
read is how you find out how much unused capacity exists.  BGP views
and Skitter data are very good at finding used paths, but they tell you
nothing about how much "shadow" capacity exists.

Generally, after a disaster you'll discover new capacity in the
Internet which you couldn't detect before the disaster.  Where
does that capacity come from, and why couldn't you detect it before
the disaster? And how much capacity exists in the shadows?
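
The blind spot is easy to see in miniature.  Another toy sketch, again
assuming networkx, with an invented four-node topology; the two vantage
points stand in for BGP views or Skitter monitors:

import networkx as nx

g = nx.Graph()
g.add_edges_from([("A", "B"), ("B", "C"), ("C", "D"),  # primary chain
                  ("A", "D")])                         # backup link
nx.set_edge_attributes(g, 1, "weight")
g["A"]["D"]["weight"] = 10  # high metric: no best path ever uses it

observed = set()
for src in ("A", "B"):  # two measurement vantage points
    for path in nx.shortest_path(g, source=src, weight="weight").values():
        observed.update(frozenset(e) for e in zip(path, path[1:]))

all_links = {frozenset(e) for e in g.edges()}
print("links invisible to measurement:", all_links - observed)
# prints: links invisible to measurement: {frozenset({'A', 'D'})}

The A-D link carries no best-path traffic, so no path measurement ever
sees it; it only shows up when a failure forces traffic onto it.  The
published maps show what's used, not what's there.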

> This has led to a scramble to come up with new Internet topology
> generators - another laundry list of references and approaches - but
> the reported security implications have had a bumpier road.  Are all
> these computer scientists and physicists wrong?  I'm sure a good case
> could be made against them, but it leaves several open questions.
> There have been several good cases made for the resilience of the
> status quo and good arguments that there could be problems.  The
> question is how much of it is political: one side looking for problems
> they can point fingers at in the name of homeland security, and one
> side denying any problem at all because it is a lot cheaper if there
> is no problem.

The research community comes up with a model of how the Internet works.
They hypothesize various outcomes based on that model.  The final section
of any research report is a list of problems requiring more research :-)

The operational community looks at the model, goes behind the curtain
and compares the model to their operational networks.  We find the model
doesn't match.  The operational community comes back out from behind
the curtain and tells the research community our network doesn't have
that problem.  And if it did, it's fixed now :-)

The research community asks, can we please peek behind the curtain?  The
operational community says, we respectfully must decline.

Rinse. Repeat.

I probably won't make it to the next NANOG in Eugene, but on Sunday
afternoon NANOG is hosting a research/operations forum to allow researchers
to solicit feedback from the operations community.  The proposal deadline
is Monday, September 16.