[94405] in North American Network Operators' Group


Re: Network end users to pull down 2 gigabytes a day, continuously?

daemon@ATHENA.MIT.EDU (Joe Abley)
Sun Jan 21 15:48:56 2007

In-Reply-To: <007501c73d94$5e4f06c0$6801a8c0@atlanta.polycom.com>
Cc: "North American Noise and Off-topic Gripes" <nanog@merit.edu>
From: Joe Abley <jabley@ca.afilias.info>
Date: Sun, 21 Jan 2007 15:40:04 -0500
To: Stephen Sprunk <stephen@sprunk.org>
Errors-To: owner-nanog@merit.edu



On 21-Jan-2007, at 14:07, Stephen Sprunk wrote:

> Every torrent indexing site I'm aware of has RSS feeds for
> newly-added torrents, categorized many different ways.  Any ISP that
> wanted to set up such a service could do so _today_ with _existing_
> tools.  All that's missing is the budget and a go-ahead from the
> lawyers.

Yes, I know.
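
(For illustration only, a rough sketch of the fetch side of such a
service: poll an index's RSS feed and drop any newly-announced
.torrent files into a directory a seeding client watches. The feed
URL, directory, and enclosure layout below are placeholders, not any
particular site's interface.)

#!/usr/bin/env python
# Hypothetical sketch: poll a torrent index RSS feed and save new
# .torrent files where a local seeding client will pick them up.
# FEED_URL, WATCH_DIR and the enclosure/link layout are assumptions.

import os
import urllib.request
import xml.etree.ElementTree as ET

FEED_URL = "http://tracker.example.net/rss/new.xml"   # placeholder
WATCH_DIR = "/var/spool/seedbox"                      # placeholder

def poll_once():
    with urllib.request.urlopen(FEED_URL) as resp:
        tree = ET.parse(resp)
    for item in tree.iterfind(".//item"):
        # Many feeds carry the .torrent URL as an <enclosure>;
        # fall back to <link> if there isn't one.
        enc = item.find("enclosure")
        url = enc.get("url") if enc is not None else item.findtext("link")
        if not url:
            continue
        dest = os.path.join(WATCH_DIR, os.path.basename(url))
        if os.path.exists(dest):
            continue  # already fetched on an earlier poll
        with urllib.request.urlopen(url) as t, open(dest, "wb") as out:
            out.write(t.read())

if __name__ == "__main__":
    poll_once()

The seeding side is the easy part, though; the open question is still
the one below.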

>> If anybody has tried this, I'd be interested to hear whether
>> on-net clients actually take advantage of the local monster seed,
>> or whether they persist in pulling data from elsewhere.
>
> [...] Do I have hard data?  No. [...]

So, has anybody actually tried this?

Speculating about how clients might behave is easy, but real  
experience is more interesting.


Joe

