[44950] in North American Network Operators' Group
Interface error rates question
daemon@ATHENA.MIT.EDU (Holmes, Daniel)
Thu Jan 3 08:51:07 2002
Message-ID: <5B628B202658D411AD8700508B7869F8057AB9AF@amt-exc1.aprisma.com>
From: "Holmes, Daniel" <dholmes@aprisma.com>
To: "'nanog@merit.edu'" <nanog@merit.edu>
Date: Thu, 3 Jan 2002 08:55:13 -0500
I'm looking for some real user input from network operators:
To what degree would you want to be able to measure error rates on device interfaces?
Many tools currently calculate error rates only at whole-percent granularity, 1% and above (so a rate of 0.9% is displayed as 0%), but it is quite possible that users want to measure fractional percentages as well.
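For concreteness, here is a rough sketch of the calculation I have in mind (my own illustration, not from any particular tool; it assumes deltas of IF-MIB-style counters such as ifInErrors and ifInUcastPkts between two polls):

    # Illustrative sketch only; assumes deltas of IF-MIB counters
    # (ifInErrors, ifInUcastPkts, ifInNUcastPkts) between two polls.

    def error_rate_percent(errors_delta, packets_delta):
        """Error rate over a polling interval, as a percentage.

        errors_delta  -- change in ifInErrors between two polls
        packets_delta -- change in total inbound packets over the
                         same interval (unicast + non-unicast + errors)
        """
        if packets_delta <= 0:
            return 0.0
        return 100.0 * errors_delta / packets_delta

    # 9 errors in 1000 packets is 0.9%: visible at fractional
    # precision, but shown as 0% by a tool that truncates to
    # whole percentages.
    print("%.2f%%" % error_rate_percent(9, 1000))   # prints 0.90%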
Does anyone have opinions or preferences based on their current experience?
Does the type of interface you are managing (Ethernet, serial, etc.) matter?
Thanks - Dan Holmes