
Re: Can P2P applications learn to play fair on networks?

Date: Tue, 23 Oct 2007 13:43:30 -0400 (EDT)
From: Sean Donelan <sean@donelan.com>
To: Iljitsch van Beijnum <iljitsch@muada.com>
cc: nanog@merit.edu
In-Reply-To: <21897724-9A0B-419A-A6B4-2633DDD5A027@muada.com>
Errors-To: owner-nanog@merit.edu


On Tue, 23 Oct 2007, Iljitsch van Beijnum wrote:
> The problem here is that they seem to be using a sledge hammer: BitTorrent is 
> essentially left dead in the water. And they deny doing anything, to boot.
>
> A reasonable approach would be to throttle the offending applications to make 
> them fit inside the maximum reasonable traffic envelope.

There are many "reasonable" things providers could do.

However, in the US last year we had folks testifying to Congress that QOS
will never work, that providers must never treat any traffic differently,
that DPI is evil, and that the answer to all our problems is just more
bandwidth.  Unfortunately, it's currently not considered acceptable for
commercial ISPs to do the same things that universities are already doing
to manage traffic on their networks.

The result is network engineering by politicians, and many reasonable
things can no longer be done:

Fair usage policies
QOS scavenger/background class of service (toy sketch below)
Tiered data cap billing
Upstream/downstream billing
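
Roughly what a scavenger/background class buys you, as a toy Python
sketch of a two-queue strict-priority scheduler (queue contents and
names are illustrative, not any particular vendor's implementation; in
practice the class is often marked DSCP CS1): background traffic only
gets whatever capacity the best-effort traffic leaves idle, instead of
being cut off outright.

  from collections import deque

  def schedule(best_effort, scavenger, capacity):
      """Send up to `capacity` packets this interval; scavenger traffic
      is only served from whatever the best-effort queue leaves unused."""
      sent = []
      while capacity and best_effort:
          sent.append(best_effort.popleft())
          capacity -= 1
      while capacity and scavenger:
          sent.append(scavenger.popleft())
          capacity -= 1
      return sent

  # During congestion, traffic marked scavenger is starved but not blocked:
  be = deque(["web1", "voip1", "web2"])
  p2p = deque(["bt1", "bt2", "bt3"])
  print(schedule(be, p2p, 4))   # ['web1', 'voip1', 'web2', 'bt1']

The leftover P2P packets simply wait for an idle interval, which is the
behaviour you want from a background class.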

Changing some of the billing methods could encourage US providers to offer 
"uncapped" line rates with "capped" data usage.  So you could have a 
20 Mbps, 50 Mbps or 100 Mbps line rate, but because upstream network 
utilization would be controlled by the amount of data transferred rather 
than by the line rate, effective prices may be lower.
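
Some back-of-the-envelope Python shows why the cap, not the line rate,
is what actually bounds upstream utilization (the 250 GB monthly cap
here is purely an illustrative number, not anything proposed in this
thread):

  SECONDS_PER_MONTH = 30 * 24 * 3600

  def avg_mbps(cap_gb):
      """Average sustained rate if a subscriber spread the cap evenly
      over a 30-day month (1 GB = 8000 megabits)."""
      return cap_gb * 8000.0 / SECONDS_PER_MONTH

  for line_rate in (20, 50, 100):
      print("%d Mbps line, 250 GB cap -> %.2f Mbps average" %
            (line_rate, avg_mbps(250)))

Whatever the line rate, a 250 GB cap works out to well under 1 Mbps of
average utilization, which is why a provider could afford to leave the
line rate itself uncapped.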

But I don't know if the blogosphere is ready for that yet in the US.
