North American Network Operators Group


Re: Can P2P applications learn to play fair on networks?

  • From: Sean Donelan
  • Date: Fri Oct 26 01:00:19 2007


On Thu, 25 Oct 2007, Marshall Eubanks wrote:
> I don't follow this, on a statistical average. This is P2P, right? So if I send you a piece
> of a file, this will go out my door once, and in your door once, after a certain (& finite!)
> number of hops (i.e., transmissions to and from other peers).
>
> So if usage is limited to each customer, isn't upstream and downstream demand also going to
> be limited, roughly to no more than the usage times the number of hops? This may be large,
> but it won't be unlimited.
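
Restating the bound quoted above in symbols (U and H are shorthand introduced here, not terms from the thread): with per-customer usage U and a finite relay depth of at most H peer hops per piece, the aggregate demand a customer can induce is bounded by

    % U = a customer's own transfer volume, H = the finite number of peer
    % hops each piece traverses; both symbols are shorthand for this note.
    D_{\text{total}} \le U \cdot H,
    \qquad \text{e.g. } U = 1\,\text{GB},\ H = 4 \;\Rightarrow\; D_{\text{total}} \le 4\,\text{GB}.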

Is the size of a USENET feed limited by how fast people can read?


If there isn't a reason for people or computers to be efficient, they don't seem to be
very efficient. There seem to be a lot of repetitious transfers, and transfers far larger
than any human could view, listen to, or read in a lifetime.

But again, that isn't the problem. Network operators like people who pay to do stuff they don't need.

The problem is sharing network capacity between all the users of the network, so that a few users or applications don't greatly impact all the other users and applications. I still doubt any network operator would care if 5% of the users consumed 5% of the network capacity 24x7x365. Network operators don't care much even when 5% of the users consume 100% of the network capacity, as long as there is no other demand for it. Network operators get concerned when 5% of the users consume 95% of the network capacity and the other 95% of the users complain about long delays, timeouts, and stuff not working.
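
For concreteness, here is a minimal sketch of the max-min fair split those percentages point at (the function, user names, and numbers below are made up for illustration, not anything from this thread): users who ask for little get all of it, and the heavy users evenly divide whatever capacity is left.

    # Hypothetical illustration: max-min fair division of a shared link.
    # All names and numbers are invented for this sketch.
    def max_min_fair(capacity, demands):
        """Give no user more than it asks for; repeatedly split the leftover
        capacity evenly among the users that are still unsatisfied."""
        alloc = {user: 0.0 for user in demands}
        unsatisfied = set(demands)
        remaining = float(capacity)
        while unsatisfied and remaining > 1e-9:
            share = remaining / len(unsatisfied)
            for user in list(unsatisfied):
                grant = min(share, demands[user] - alloc[user])
                alloc[user] += grant
                remaining -= grant
                if alloc[user] >= demands[user] - 1e-9:
                    unsatisfied.discard(user)
        return alloc

    # 95 light users wanting 1 unit each, 5 heavy users wanting far more,
    # all sharing a 200-unit link.
    demands = {f"light{i}": 1.0 for i in range(95)}
    demands.update({f"heavy{i}": 1000.0 for i in range(5)})
    alloc = max_min_fair(capacity=200.0, demands=demands)
    print(alloc["light0"], alloc["heavy0"])   # -> 1.0 21.0

In this toy split every light user is fully served and each heavy user still gets far more than a light user; the complaints above are what happens when nothing in the network enforces anything like it.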

When 5% of the users don't play nicely with the other 95% of the users, how can
network operators manage the network so that every user receives a fair share of
the network capacity?
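
One classical mechanism aimed at exactly this is per-user fair queueing. Below is a minimal Deficit Round Robin sketch (the class and field names are my own, for illustration only, not a description of any particular vendor's implementation): each backlogged user earns one quantum of credit per round, so a user queueing thousands of packets drains no faster per round than a user queueing one.

    # Hypothetical sketch of per-user Deficit Round Robin (DRR) scheduling.
    from collections import deque

    class DRRScheduler:
        def __init__(self, quantum_bytes=1500):
            self.quantum = quantum_bytes   # credit added per backlogged user per round
            self.queues = {}               # user -> deque of packet sizes (bytes)
            self.deficit = {}              # user -> accumulated, unspent credit

        def enqueue(self, user, pkt_bytes):
            self.queues.setdefault(user, deque()).append(pkt_bytes)
            self.deficit.setdefault(user, 0)

        def dequeue_round(self):
            """One scheduling round: each backlogged user may send packets
            up to its current deficit, then keeps any unspent credit."""
            sent = []
            for user, q in self.queues.items():
                if not q:
                    continue
                self.deficit[user] += self.quantum
                while q and q[0] <= self.deficit[user]:
                    pkt = q.popleft()
                    self.deficit[user] -= pkt
                    sent.append((user, pkt))
                if not q:                  # idle users don't bank credit
                    self.deficit[user] = 0
            return sent

Enqueue a burst of packets for a heavy user and a single packet for a light user, and dequeue_round() still serves both at roughly one quantum per round; the heavy user's backlog grows instead of the light user's latency.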