[94059] in North American Network Operators' Group
Re: Network end users to pull down 2 gigabytes a day, continuously?
daemon@ATHENA.MIT.EDU (Mark Smith)
Mon Jan 8 05:43:38 2007
Date: Mon, 8 Jan 2007 21:12:16 +1030
From: Mark Smith <nanog@fa1c52f96c54f7450e1ffb215f29991e.nosense.org>
To: Michael.Dillon@btradianz.com
Cc: nanog@merit.edu
In-Reply-To: <OFC3144D41.34E580D1-ON8025725D.00373051-8025725D.00394877@btradianz.com>
Errors-To: owner-nanog@merit.edu
On Mon, 8 Jan 2007 10:25:54 +0000
Michael.Dillon@btradianz.com wrote:
<snip>
>
> I am suggesting that ISP folks should be cooperating with
> P2P software developers. Typically, the developers have a very
> vague understanding of how the network is structured and are
> essentially trying to reverse engineer network capabilities.
> It should not be too difficult to develop P2P clients that
> receive topology hints from their local ISPs. If this results
> in faster or more reliable/predictable downloads, then users
> will choose to use such a client.
>
I'd think TCP's underlying, constantly updated round trip time
measurement to peers could be used for that. I've wondered fairly
recently whether P2P protocols do that, but haven't found the time to
check whether any actually do.
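The idea above can be sketched in a few lines: time the TCP three-way handshake to each candidate peer and prefer the nearest ones. This is a toy illustration, not any real P2P client's code; the names `measure_rtt` and `rank_peers` are made up for the example, and connect time is only a rough proxy for the smoothed RTT the kernel's TCP stack maintains.

```python
import socket
import time

def measure_rtt(host, port, timeout=2.0):
    """Estimate RTT (seconds) by timing a TCP connect (three-way handshake)."""
    start = time.monotonic()
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return time.monotonic() - start
    except OSError:
        return float("inf")  # unreachable peers sort to the end

def rank_peers(peers, probe=measure_rtt):
    """Return (host, port) pairs sorted by measured RTT, nearest first."""
    return sorted(peers, key=lambda p: probe(*p))
```

A client could then fetch pieces preferentially from the head of the ranked list, which tends to favor topologically close (often same-ISP) peers without any explicit topology hints.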
--
"Sheep are slow and tasty, and therefore must remain constantly
alert."
- Bruce Schneier, "Beyond Fear"