North American Network Operators Group

Re: Providers removing blocks on port 135?

  • From: Iljitsch van Beijnum
  • Date: Sun Sep 21 06:47:01 2003


On Saturday, Sep 20, 2003, at 21:36 Europe/Amsterdam, Sean Donelan wrote:

> Should any dialup, dsl, cable, wi-fi, dhcp host be able to use any service
> at any time? For example run an SMTP mailer, or leave Network
> Neighborhood open for others to browse or install software on their
> computers?

As someone who has been using IP for a while now, I would very much like to be able to use any service at any time.

> Or should ISPs have a "default deny" on all services, and subscribers need
> to call for permission if they want to use some new service? Should new
> services like Voice over IP, or even the World Wide Web be blocked by
> default by service providers?

Obviously not. Blocking services that are known to be bad or vulnerable wouldn't be entirely unreasonable, though. But who gets to decide which services should be blocked? Some services are very dangerous and not very useful, so blocking them is a no-brainer. Other services are only slightly risky and very useful. Where do we draw the line? And who draws it?


> As a HOST requirement, I think all hosts should be "client-only" by
> default. That includes things acting as hosts, such as routers,
> switches, print servers, file servers, UPSes. If a HOST uses a
> network protocol for local host processes (e.g. X-Windows, BIFF, Syslog,
> DCE, RPC) by default it should not accept network connections.
>
> It should require some action, e.g. the user enabling the service,
> DHCP-client enabling it in a profile, clicking things on the LCD display
> on the front of the printer, etc.

Get yourself a Mac. :-)
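
To make the "client-only by default" idea concrete, here is a minimal sketch (Python) of what such a host-side default could look like. The ENABLED_SERVICES set and the service names are hypothetical, not any existing OS mechanism:

    # A minimal sketch of "client-only by default": a listening socket is
    # only opened if the operator has explicitly enabled the service.
    # Names and ports are illustrative only.
    import socket

    ENABLED_SERVICES = {"ssh"}   # hypothetical: populated only by an explicit user action

    def start_service(name: str, port: int) -> socket.socket:
        """Open a listening socket only for explicitly enabled services."""
        if name not in ENABLED_SERVICES:
            raise PermissionError(f"{name} is client-only by default; enable it first")
        s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        s.bind(("0.0.0.0", port))
        s.listen()
        return s

    def fetch(host: str, port: int = 80) -> bytes:
        """Outbound (client) connections need no enabling step."""
        with socket.create_connection((host, port), timeout=5) as c:
            c.sendall(b"HEAD / HTTP/1.0\r\nHost: " + host.encode() + b"\r\n\r\n")
            return c.recv(512)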

I think it would be useful to set aside a block of port numbers for local use. These would be easy to filter at the edges of networks, but plug and play would still be possible.
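
A rough sketch of what edge filtering on such a reserved block could look like; the port range below is made up for illustration and is not an actual IANA assignment:

    # Hypothetical reserved "local-use" port range; numbers are invented.
    LOCAL_LOW, LOCAL_HIGH = 48000, 48999

    def allow_at_edge(dst_port: int) -> bool:
        """Border filter: drop traffic aimed at local-use-only ports."""
        return not (LOCAL_LOW <= dst_port <= LOCAL_HIGH)

    assert allow_at_edge(80)          # ordinary traffic crosses the border
    assert not allow_at_edge(48123)   # plug-and-play service stays inside the site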

> SERVICE PROVIDERS do not enforce host requirements.

But someone has to. The trouble is that access to the network has never been considered a liability, except for local ports under 1024. (Have a look at Java, for example.) I believe the only way to solve all this nonsense is a mechanism that controls applications' access to the network, preferably one that sits outside the host, or at least deep enough inside the system to be protected against application holes and user stupidity. It must be based not only on application type and user rights (user www gets to run a web server that listens on port 80) but also on application version, so that when a vulnerability is found, the vulnerable version of the application is automatically blocked.
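
As a very rough sketch of that kind of mechanism (all application names, versions, users, and ports below are hypothetical), the policy decision might look something like this:

    # Sketch: allow an application to accept connections only if its (user, app)
    # pair is authorized for the port and its version has no known hole.
    VULNERABLE = {("httpd", "2.0.40")}              # hypothetical vulnerable build
    POLICY = {("www", "httpd"): {80, 443}}          # (user, app) -> allowed listening ports

    def may_listen(user: str, app: str, version: str, port: int) -> bool:
        if (app, version) in VULNERABLE:
            return False                            # block the vulnerable version outright
        return port in POLICY.get((user, app), set())

    print(may_listen("www", "httpd", "2.0.52", 80))   # True
    print(may_listen("www", "httpd", "2.0.40", 80))   # False: known hole, blocked automatically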

I don't see something like this popping up overnight, though.