North American Network Operators Group


Re: Network end users to pull down 2 gigabytes a day, continuously?

  • From: Joe Abley
  • Date: Mon Jan 08 13:21:41 2007



On 8-Jan-2007, at 02:34, Sean Donelan wrote:

> On Sun, 7 Jan 2007, Joe Abley wrote:
>> Setting aside the issue of what particular ISPs today have to pay, the real cost of sending data, best-effort, over an existing network which has spare capacity and which is already supported and managed is surely zero.
>
> As long as the additional traffic doesn't exceed the existing capacity.

Indeed.


So perhaps we should expect to see distribution price models whose success depends on that spare (off-peak, whatever) capacity being available replaced by models which don't depend on it.

If that's the case, and assuming the cost benefits of using slack capacity continue to be exploited, the bandwidth metrics mentioned in the original post might be those which assume a periodic utilisation profile, rather than those which just assume that spare bandwidth will be used.

(It's still accounting based on peak; the difference might be that in the second model there really isn't that much of a peak any more, and the effect of that is a bonus window during which existing capacity models will sustain the flood.)
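
Purely as a back-of-envelope illustration (my numbers, not from the original post): under 95th-percentile billing with a toy sinusoidal diurnal profile, bulk transfers scheduled into the trough can move a great deal of extra data without changing the billed figure at all. A minimal sketch in Python, assuming five-minute samples and invented peak/trough rates:

    import math

    SAMPLES_PER_DAY = 288            # five-minute samples (300 s each)

    def diurnal(sample, peak_mbps=1000, trough_mbps=300):
        """Toy diurnal curve: trough at sample 0, peak half a day later."""
        mid = (peak_mbps + trough_mbps) / 2
        amp = (peak_mbps - trough_mbps) / 2
        return mid - amp * math.cos(2 * math.pi * sample / SAMPLES_PER_DAY)

    def p95(samples):
        return sorted(samples)[int(len(samples) * 0.95)]

    base = [diurnal(s) for s in range(SAMPLES_PER_DAY)]
    billed = p95(base)

    # Shift bulk (asynchronous) traffic into the trough, capped so that
    # no sample exceeds the currently-billed 95th percentile.
    filled = [max(r, min(r + 400, billed)) for r in base]

    print("95th percentile before:", round(p95(base)))
    print("95th percentile after: ", round(p95(filled)))   # unchanged
    extra_gb = sum(f - b for f, b in zip(filled, base)) * 300 / 8 / 1000
    print("extra delivered per day:", round(extra_gb), "GB")

The "bonus window" above is exactly the headroom between the trough and the billed percentile; once the troughs fill, any further growth moves the percentile itself.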

> If you limit yourself to the Internet, you exclude a lot of content being shifted around and consumed in the world. The World Cup or the Super Bowl are still much bigger events than any Internet-only event. Broadcast television shows with even bottom ratings are still more popular than most Internet content. The Internet is good for narrowcasting, but it's still working on mass audience events.

Ah, but I wasn't comparing Internet distribution with cable/satellite/UHF/whatever -- I was comparing content which is streamed with content which isn't.


The cost differences between those are fairly well understood, I think. Reliable, high-quality streaming media is expensive (ask someone like Akamai for a quote), whereas asynchronous delivery of content (e.g. through BitTorrent trackers) can distribute enormous amounts of data from a centralised investment in hardware and network small enough to be demonstrably sustainable by voluntary donations.
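
To show the shape of that comparison (with invented unit prices, not anyone's actual quote): the streaming publisher pays per byte delivered to every viewer, while a swarm publisher pays only to inject a handful of copies and lets the peers redistribute the rest among themselves.

    # Back-of-envelope comparison; every price here is assumed, not quoted.
    viewers = 100_000
    file_gb = 0.7                      # one ~700 MB programme

    cdn_price_per_gb = 0.30            # assumed per-GB delivery price, USD
    cdn_cost = viewers * file_gb * cdn_price_per_gb

    # Swarm model: publisher seeds each byte a handful of times; peers
    # redistribute the remaining copies among themselves.
    seed_copies = 5                    # assumed copies injected by publisher
    transit_price_per_gb = 0.10        # assumed transit price, USD
    swarm_cost = seed_copies * file_gb * transit_price_per_gb

    print(f"streamed via CDN:  ${cdn_cost:,.0f}")
    print(f"seeded to a swarm: ${swarm_cost:,.2f}")

The exact numbers don't matter; what matters is that one cost scales with the audience and the other doesn't.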

"Asynchronous receivers" are more expensive and usually more complicated
than "synchronous receivers."

Well, there's no mainstream, blessed product which does this kind of asynchronous acquisition of content on anything like the scale of digital cable terminals; however, that's not to say that one couldn't be produced for the same cost. I'd guess that most of those digital cable boxes are running Linux anyway, which makes it a software problem.
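
To make "a software problem" concrete, here's a minimal sketch of the sort of thing such a box could run: wake during an off-peak window and pull queued content down asynchronously. Everything in it (the feed URL, the spool path, the window) is hypothetical, not a description of any real product.

    import time, urllib.request
    from datetime import datetime

    FEED = "http://example.net/queue.txt"   # hypothetical: one URL per line
    WINDOW = range(2, 6)                    # fetch only between 02:00-05:59

    def fetch_queued():
        # Read the queue and download each item into local storage.
        for url in urllib.request.urlopen(FEED).read().decode().split():
            name = url.rsplit("/", 1)[-1]
            urllib.request.urlretrieve(url, "/var/spool/content/" + name)

    while True:
        if datetime.now().hour in WINDOW:
            fetch_queued()
        time.sleep(3600)                    # check again in an hour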


If we're considering a fight between an intelligent network (one which can support good-quality, isochronous streaming video at high data rates from the producer to the consumer) and a stupid one (which concentrates on best-effort, asynchronous distribution of data, with a smarter edge) then, absent external constraints regarding copyright, digital rights, etc., I presume we'd expect the stupid network model to win. Eventually.

> Not everyone owns a computer or spends several hundred dollars for a DVR. If you already own a computer, you might consider it "free."

Since I was comparing two methods of distributing material over the Internet, the availability of a computer is more or less a given. I'm not aware of a noticeable population of broadband users who don't own a computer, for example (apart from those who are broadband users without noticing, e.g. through a digital cable terminal which talks IP to the network).



Joe