[56733] in North American Network Operators' Group
Re: route filtering in large networks
daemon@ATHENA.MIT.EDU (Alan Hannan)
Wed Mar 12 23:12:37 2003
Date: Wed, 12 Mar 2003 19:47:09 -0800
From: Alan Hannan <alan@routingloop.com>
To: Andy Dills <andy@xecu.net>
Cc: nanog@merit.edu
In-Reply-To: <Pine.BSF.4.44.0303122151070.75700-100000@thunder.xecu.net>; from andy@xecu.net on Wed, Mar 12, 2003 at 10:22:53PM -0500
Errors-To: owner-nanog-outgoing@merit.edu
> I must not understand something. How would the banana eaters screw up
> applying the same prefix-list outbound to all neighbors?
Humans tend to be imprecise. Scripted actions tend to be very precise.
Implementing a process by which humans manually enter configurations
is prone to error and more difficult to check.
Implementing a scripted, automated process that enters configurations
from a text file or database is more likely to be precise and thorough.
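As a minimal sketch of the scripted approach described above (all names, file
formats, and neighbor addresses here are invented for illustration): generate
one identical outbound prefix-list plus the per-neighbor statements that apply
it, from a single source of truth, so every neighbor gets exactly the same
policy.

```python
# Hypothetical sketch: emit one outbound prefix-list and apply it
# identically to every neighbor, driven by a single list of prefixes
# (which in practice would come from a text file or database).

def build_config(prefixes, neighbors, list_name="CUSTOMER-OUT"):
    lines = []
    # One prefix-list, generated once -- no chance of per-router drift.
    for seq, prefix in enumerate(prefixes, start=1):
        lines.append(f"ip prefix-list {list_name} seq {seq * 5} permit {prefix}")
    # The same list applied outbound to every neighbor.
    for nbr in neighbors:
        lines.append(f"neighbor {nbr} prefix-list {list_name} out")
    return "\n".join(lines)

prefixes = ["192.0.2.0/24", "198.51.100.0/24"]   # placeholder data
neighbors = ["10.0.0.1", "10.0.0.2"]             # placeholder neighbors
print(build_config(prefixes, neighbors))
```

Because the configuration is generated rather than typed, precision follows
from the input data: fix the file once and every router gets the fix.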
In an anecdotal case, a human going router by router to update ACL 101
is more prone to accidentally skip a line in the vi list or web list
that guides his manual logins. Another simple error a human could
make is to accidentally paste the contents of the wrong cut/paste buffer.
An automated computer program or script is much more likely to be precise.
Notice I use the word "precise" above and not accurate. Humans may be
more accurate in that they are intelligent enough to fix one-off problems.
But when managing many, many objects, most network folks would value
reliable precision over occasional accuracy.
One can always find inaccuracies manually, and one can build exception
reports for 'one-off situations' into a precise script.
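The exception-report idea above can be sketched precisely (device names and
ACL contents below are invented for illustration): diff each router's deployed
lines against the intended set and flag any drift, so a human only has to
exercise judgment on the exceptions.

```python
# Hypothetical sketch: a precise exception report. Compare the intended
# line set against what each router actually carries and flag drift;
# the device data here is made up for illustration.

def exception_report(intended, deployed_by_router):
    intended = set(intended)
    report = {}
    for router, lines in deployed_by_router.items():
        missing = intended - set(lines)   # lines a manual update skipped
        extra = set(lines) - intended     # lines that shouldn't be there
        if missing or extra:
            report[router] = {"missing": sorted(missing),
                              "extra": sorted(extra)}
    return report

intended = ["permit 192.0.2.0/24", "permit 198.51.100.0/24"]
deployed = {
    "core1": ["permit 192.0.2.0/24", "permit 198.51.100.0/24"],  # in sync
    "core2": ["permit 192.0.2.0/24"],                            # skipped a line
}
print(exception_report(intended, deployed))
```

The script stays dumb and precise; the report hands the one-off cases to an
intelligent, accurate human.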
This is why many folks rightfully argue that change management should
be scripted, not entrusted to less experienced humans working manually.
There's a balance, and you can't have the Olivaws running amok with
their unintelligent precision. The automated processes must be well
thought out and audited by an intelligent, accurate human.
-a