North American Network Operators Group


Re: Network end users to pull down 2 gigabytes a day, continuously?

  • From: Joe Abley
  • Date: Sun Jan 21 11:58:15 2007



On 21-Jan-2007, at 07:14, Alexander Harrowell wrote:

> Regarding your first point, it's really surprising that existing P2P applications don't include topology awareness. After all, the underlying TCP already has mechanisms to perceive the relative nearness of a network entity: counting hops or round-trip latency. Imagine a BitTorrent-like client that searches for available torrents and records the round-trip time to each host it contacts. It places these in a lookup table and picks the fastest responders to initiate the data transfer. Those are likely to be the closest, if not in distance then topologically, and the ones with the most bandwidth. Further, imagine that it caches the search, so when you next seek a file it checks for it first on the hosts nearest to it in its "routing table", stepping down progressively if it's not there. It's a form of local-pref.
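
A minimal sketch of the ranking table described there, in Python. The peer list, the probe port, and using TCP connect time as the RTT proxy are all assumptions for illustration, not anything an existing client does:

    import socket
    import time

    def probe_rtt(host, port=6881, timeout=2.0):
        """Approximate round-trip time by timing a TCP connect."""
        start = time.monotonic()
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return time.monotonic() - start
        except OSError:
            return None  # unreachable peers are dropped from the table

    def rank_peers(peers):
        """Build the 'routing table': peers sorted fastest-first by RTT."""
        table = {}
        for host in peers:
            rtt = probe_rtt(host)
            if rtt is not None:
                table[host] = rtt
        return sorted(table, key=table.get)

    # Pick the fastest responders to initiate the data transfer from.
    nearest = rank_peers(["198.51.100.7", "203.0.113.9", "192.0.2.23"])[:4]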

Remember, though, that the dynamics of the system need to assume that individual clients will be selfish: even though it might be in the interests of the network as a whole to choose local peers, if a client can get faster *throughput* (not round-trip response) from a remote peer, we have to assume that it will use the remote peer.


Protocols need to be designed such that a client is rewarded in faster downloads for uploading in a fashion that best benefits the swarm.
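
This is roughly what BitTorrent's tit-for-tat choking already tries to do: periodically unchoke the peers that have recently uploaded to you fastest, plus one random "optimistic" slot so newcomers can prove themselves. A toy sketch, with the Peer structure and rates assumed:

    import random
    from dataclasses import dataclass

    @dataclass(eq=False)  # eq=False keeps instances hashable by identity
    class Peer:
        addr: str
        recv_rate: float      # bytes/sec this peer has recently uploaded to us
        choked: bool = True

    def rechoke(peers, slots=4):
        """Unchoke whoever uploads to us fastest, plus one optimistic slot."""
        by_rate = sorted(peers, key=lambda p: p.recv_rate, reverse=True)
        unchoked = set(by_rate[:slots - 1])
        rest = [p for p in peers if p not in unchoked]
        if rest:
            unchoked.add(random.choice(rest))
        for p in peers:
            p.choked = p not in unchoked

    swarm = [Peer("192.0.2.%d" % i, recv_rate=i * 1000.0) for i in range(1, 9)]
    rechoke(swarm)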

> The third step is for content producers to directly add their torrents
> to the ISP peers before releasing the torrent to the public. This gets
> "official" content pre-positioned for efficient distribution, making it
> perform better (from a user's perspective) than pirated content.
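
Mechanically, that pre-positioning step is small: push the .torrent to each ISP seed box, give them a head start, then announce publicly. A sketch, where the ingest endpoints, the POST interface and the publish stub are all hypothetical:

    import time
    import urllib.request

    ISP_SEEDS = [
        "http://seed.isp-a.example/torrents",   # hypothetical ingest endpoints
        "http://seed.isp-b.example/torrents",
    ]

    def publish_to_public_feed(torrent_path):
        # stand-in for the real RSS/site announcement
        print("announcing %s publicly" % torrent_path)

    def prerelease(torrent_path, head_start=3600):
        """Hand the torrent to the ISP seed boxes, then go public later."""
        with open(torrent_path, "rb") as f:
            data = f.read()
        for url in ISP_SEEDS:
            req = urllib.request.Request(
                url, data=data, method="POST",
                headers={"Content-Type": "application/x-bittorrent"})
            urllib.request.urlopen(req)
        time.sleep(head_start)   # let the "official" copies pre-position first
        publish_to_public_feed(torrent_path)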

If there were a big, fast server in every ISP with a monstrous pile of disk: one which retrieved torrents automatically from a selection of popular RSS feeds, kept seeding them for as long as there was interest and/or disk, and had rate shaping installed on the host so that traffic that was neither on-net (e.g. to/from customers) nor free (e.g. to/from peers) was rate-crippled. How far would that go toward emulating this behaviour with existing live torrents? Speaking from a technical perspective only, and ignoring the legal minefield.
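
A rough sketch of the host-side pieces, in Python. The feed list, the free prefixes, and doing the shaping in the application rather than the kernel are all assumptions:

    import ipaddress
    import urllib.request
    import xml.etree.ElementTree as ET

    FEEDS = ["http://example.net/popular.rss"]          # assumed watch list
    FREE = [ipaddress.ip_network("192.0.2.0/24"),       # on-net customers
            ipaddress.ip_network("198.51.100.0/24")]    # settlement-free peers
    CRIPPLED_RATE = 16 * 1024                           # bytes/sec for everyone else

    def new_torrents():
        """Pull .torrent enclosure URLs from the watched RSS feeds."""
        for feed in FEEDS:
            tree = ET.parse(urllib.request.urlopen(feed))
            for enc in tree.iter("enclosure"):
                url = enc.get("url", "")
                if url.endswith(".torrent"):
                    yield url

    def upload_cap(peer_ip):
        """Uncapped to customers and free peers; rate-crippled to the rest."""
        addr = ipaddress.ip_address(peer_ip)
        if any(addr in net for net in FREE):
            return None                                 # no cap
        return CRIPPLED_RATE

In practice the crippling would more likely live in the kernel (e.g. tc on the seed host, keyed on prefix lists learned from BGP) than in the seeder itself, but the policy is the same.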


If anybody has tried this, I'd be interested to hear whether on-net clients actually take advantage of the local monster seed, or whether they persist in pulling data from elsewhere.
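
One way to measure it: take flow records for the customer prefixes and see what fraction of their inbound torrent traffic is sourced from the seed box. A sketch, with the addresses assumed:

    import ipaddress

    SEED = ipaddress.ip_address("192.0.2.1")           # the local monster seed
    CUSTOMERS = ipaddress.ip_network("192.0.2.0/24")   # on-net clients

    def local_seed_share(flows):
        """flows: (src_ip, dst_ip, bytes) records, e.g. exported via NetFlow.
        Returns the fraction of bytes delivered to on-net clients that
        came from the local seed rather than from elsewhere."""
        from_seed = total = 0
        for src, dst, nbytes in flows:
            if ipaddress.ip_address(dst) in CUSTOMERS:
                total += nbytes
                if ipaddress.ip_address(src) == SEED:
                    from_seed += nbytes
        return from_seed / total if total else 0.0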


Joe