North American Network Operators Group
Interface error_rates question
I'm looking for some real-world input from network operators: to what degree would you want to be able to measure error rates on device interfaces? Many tools currently report only whole percentages of 1% and above (so a 0.9% error rate is displayed as 0%), but users may well want to measure fractional percentages too. Does anyone have opinions or preferences based on current experience? Does the type of interface you are managing (Ethernet, serial, etc.) matter? Thanks - Dan Holmes
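To make the truncation effect concrete, here is a minimal sketch of computing an error rate from interface counter deltas. The function and parameter names are illustrative only, not from any particular tool or MIB; the point is simply that a whole-percent display rounds a sub-1% rate down to zero.

```python
# Sketch: fractional error rate from two counter-snapshot deltas.
# Names (errors_delta, packets_delta) are hypothetical, not tied to
# any specific SNMP object or monitoring product.

def error_rate_percent(errors_delta: int, packets_delta: int) -> float:
    """Errors as a percentage of total frames seen over an interval."""
    total = packets_delta + errors_delta
    if total == 0:
        return 0.0  # idle interval: avoid division by zero
    return 100.0 * errors_delta / total

# 9 errors out of 1000 frames is a 0.9% error rate:
rate = error_rate_percent(9, 991)
print(f"{rate:.2f}%")   # fractional display shows 0.90%
print(f"{int(rate)}%")  # whole-percent display hides it as 0%
```

A tool that floors to integer percentages therefore cannot distinguish a clean link from one losing nearly 1 in 100 frames, which is the scenario the question describes.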