Two common ways of stating an instrument's accuracy
What is the difference between a tool's accuracy stated at Full Scale/Range versus at Reading/Setting?
ACCURACY OF A TOOL - the acceptable deviation from a specified standard or tolerance.
1. Full Scale/Range - the maximum deviation is determined by multiplying the maximum (full-scale) reading by the stated full-scale accuracy.
2. Reading/Setting - the maximum deviation is determined by multiplying the actual reading or setting by the stated accuracy.
Full Scale/Range - testing a tool with a range of 10-100
If a torque wrench has a maximum range of 100 with a stated accuracy of ±5% of full scale, the acceptable tolerance range at a setting of 100 would be 95-105. Using the same tool at a setting of 10, the acceptable tolerance range would be 5-15 (the ±5 deviation is fixed by the full-scale value, not by the setting).
Maximum range multiplied by stated full-scale accuracy: range 10-100, ±5% of Full Scale/Range
| Preset | Acceptable Tolerance |
|--------|----------------------|
| 100    | 95 - 105             |
| 50     | 45 - 55              |
| 10     | 5 - 15               |
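As a sketch of the arithmetic above (the function name is mine, for illustration only): with a full-scale accuracy spec, the allowed deviation is a fixed number computed from the full-scale value, so the same ±5 band applies at every setting.

```python
# Sketch: tolerance limits when accuracy is stated as a percentage
# of Full Scale/Range. Names are illustrative, not from any standard.
def full_scale_limits(preset, full_scale, accuracy_pct):
    # Deviation is fixed: a percentage of the full-scale value,
    # regardless of the setting actually in use.
    deviation = full_scale * accuracy_pct / 100.0
    return (preset - deviation, preset + deviation)

# Reproduces the table: 100 -> 95-105, 50 -> 45-55, 10 -> 5-15
for preset in (100, 50, 10):
    low, high = full_scale_limits(preset, full_scale=100, accuracy_pct=5)
    print(preset, low, high)
```

Note that at the low end of the range (preset 10) the fixed ±5 band is ±50% of the setting, which is why full-scale specs look much worse when the tool is used well below its maximum.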
Reading/Setting - testing a tool with a range of 10-100 lbf.in
If a torque wrench has a maximum range of 100 with a stated accuracy of ±5% of reading, the acceptable tolerance range at a setting of 100 would be 95-105. Using the same tool at a setting of 10, the acceptable tolerance range would be 9.5-10.5.
Reading/Setting multiplied by stated accuracy: range 10-100, ±5% of reading/setting
| Preset | Acceptable Tolerance |
|--------|----------------------|
| 100    | 95 - 105             |
| 50     | 47.5 - 52.5          |
| 10     | 9.5 - 10.5           |
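The same arithmetic for the reading/setting convention can be sketched as follows (again, an illustrative function name, not standard terminology): here the deviation scales with the setting itself, so the tolerance band stays at ±5% of whatever value the tool is set to.

```python
# Sketch: tolerance limits when accuracy is stated as a percentage
# of the Reading/Setting itself. Illustrative names only.
def reading_limits(preset, accuracy_pct):
    # Deviation scales with the setting in use.
    deviation = preset * accuracy_pct / 100.0
    return (preset - deviation, preset + deviation)

# Reproduces the table: 100 -> 95-105, 50 -> 47.5-52.5, 10 -> 9.5-10.5
for preset in (100, 50, 10):
    low, high = reading_limits(preset, accuracy_pct=5)
    print(preset, low, high)
```

Comparing the two sketches makes the practical difference clear: the specs agree only at full scale; everywhere below it, a reading/setting spec gives a proportionally tighter band.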
[ This post was last edited by afeb on 2008-12-22 at 17:21 ]