North American Network Operators Group


RE: ARIN Policy on IP-based Web Hosting

  • From: Jason Slagle
  • Date: Thu Aug 31 10:37:30 2000


---
Jason Slagle - CCNA - CCDA
Network Administrator - Toledo Internet Access - Toledo Ohio
- [email protected] - [email protected] - WHOIS JS10172
-----BEGIN GEEK CODE BLOCK-----
Version: 3.12 GE d-- s:+ a-- C++ UL+++ P--- L+++ E- W- N+ o-- K- w---
O M- V PS+ PE+++ Y+ PGP t+ 5 X+ R tv+ b+ DI+ D G e+ h! r++ y+
------END GEEK CODE BLOCK------


On Thu, 31 Aug 2000, Edward S. Marshall wrote:

> HTTP, perchance? The only things missing are a machine-parsable file
> indexing method (which would be easy enough to standardize on if someone
> felt the need to do so; think a "text/directory" MIME type, which would
> benefit more than just HTTP, or use a multipart list of URLs), and
> server-to-server transfers coordinated from your client, which most people
> have disabled anyway for security reasons.

You know, I almost challenged someone to suggest HTTP in my original
posting but decided not to as I didn't think anyone would.

Not all of us enjoy point-and-clicky interfaces (even lynx in this context
is point-and-clicky).

I don't see how HTTP can duplicate the functionality of command-line FTP
(yes, I know it COULD be done, but it would be a hack at best: ls becomes
"transfer a directory listing."  And what of the ASCII and BINARY data
types?  ASCII transfers still have their uses.)
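
To illustrate the two points, here is a rough sketch in Python's ftplib;
ftp.example.com and the file names are made up:

    # "ls" maps straight onto the LIST command, and ASCII vs. BINARY
    # is an explicit, per-transfer choice on the protocol level.
    from ftplib import FTP

    ftp = FTP("ftp.example.com")
    ftp.login()                       # anonymous login

    ftp.retrlines("LIST")             # ls: server sends a directory listing

    ftp.retrlines("RETR README")      # ASCII transfer (TYPE A)
    with open("file.tar.gz", "wb") as f:
        ftp.retrbinary("RETR file.tar.gz", f.write)  # binary (TYPE I)

    ftp.quit()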

> But, you get the added benefit of MIME typing

I don't consider this a benefit, as I already have enough problems with
HTTP servers serving the wrong MIME type for .gz.  I'm TRANSFERRING a
file.  The MIME type is mostly irrelevant there.  Why do I need to know
that it's image/jpeg if I'm just transferring it to my local drive?

The MIME type is only beneficial if you're attempting to do something with
the file after receipt.  If I were, I'd be using HTTP or another
protocol.  I'm not; I'm transferring it.
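
For a plain transfer the label changes nothing; a minimal Python sketch,
with a made-up URL, makes the point:

    # Fetch a file over HTTP and save it.  Whatever Content-Type the
    # server claims, the bytes written to disk are identical -- the
    # MIME type buys nothing for a transfer-to-disk.
    import urllib.request

    resp = urllib.request.urlopen("http://www.example.com/file.tar.gz")
    print(resp.headers.get("Content-Type"))   # often wrong for .gz anyway
    with open("file.tar.gz", "wb") as f:
        f.write(resp.read())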

> human-beneficial markup

Once again, of no use.  I don't want thumbnails when I ls -l a directory
of JPEGs.

> caching if you have a nearby cache

There's no reason I can't cache it now.  Squid manages to (granted, it's
handling ftp:// URLs, but you could hack up a "real" FTP proxy that
caches).
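
For example, a client can already push ftp:// URLs through Squid like any
proxied HTTP request; a rough Python sketch, assuming a Squid instance at
the made-up address proxy.example.com:3128:

    # Fetch an ftp:// URL through an HTTP proxy (e.g. Squid), which can
    # cache the result.  The proxy address and URL are assumptions.
    import urllib.request

    opener = urllib.request.build_opener(
        urllib.request.ProxyHandler({"ftp": "http://proxy.example.com:3128"}))
    with opener.open("ftp://ftp.example.com/pub/file.tar.gz") as resp:
        with open("file.tar.gz", "wb") as f:
            f.write(resp.read())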

> inherent firewall friendliness

Point taken here.

> simple negotiation of encrypted transfers (SSL)

Here also.

> And for command-line people like myself, there's lynx, w3m, and wget.

I've never used w3m, but lynx and wget lack the functionality of just
about any FTP client.

When I want to choose from a list of files and "click" on the one I want,
I already use lynx with ftp:// URLs.  But often I don't want to do that,
as I'm transferring multiple files.
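
The multiple-file case is roughly mget; in Python's ftplib it looks like
this (host, directory, and pattern are made up):

    # mget-style loop: list a directory, then fetch every .gz file in
    # one session.
    from ftplib import FTP

    ftp = FTP("ftp.example.com")
    ftp.login()
    ftp.cwd("/pub")
    for name in ftp.nlst():           # NLST: bare list of file names
        if name.endswith(".gz"):
            with open(name, "wb") as f:
                ftp.retrbinary("RETR " + name, f.write)
    ftp.quit()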

> FTP is disturbingly behind on features, some of which (decent certificate
> authentication, full-transaction encryption, data type labelling, and
> cache usefulness) are becoming more important today. Either the FTP RFC
> needs a near-complete overhaul, or the HTTP and other RFCs need to be
> updated to include the missing functionality.

I'm not arguing that it isn't.  I'm just saying that until a NEW protocol
comes out, or someone overhauls the existing FTP protocol, you can't scrap
it, as nothing I have found duplicates the functionality I want in an FTP
client.  wget comes close for HTTP, but I have to know what I want
beforehand.

The solution (as several people have emailed me to say) may be sftp or scp
with encryption and compression turned off when not needed (which I
confirmed is nearly as fast as FTP).
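
A rough sketch of the SFTP side of that, using the third-party paramiko
library with made-up host and credentials (compression is left off; stock
SSH doesn't normally offer a null cipher, so encryption stays on here):

    # Single-file SFTP pull with compression disabled.  Host, user, and
    # paths are assumptions for illustration.
    import paramiko

    transport = paramiko.Transport(("sftp.example.com", 22))
    transport.use_compression(False)   # skip compression when not needed
    transport.connect(username="user", password="secret")
    sftp = paramiko.SFTPClient.from_transport(transport)
    sftp.get("/pub/file.tar.gz", "file.tar.gz")
    sftp.close()
    transport.close()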

I'd be willing to work on hammering out a new and "improved" FTP protocol
if several others are interested.

Jason