Mark-10 defines accuracy as a percentage of the instrument’s full scale. To determine the measurement error as an actual load value, multiply the accuracy percentage by the instrument’s capacity.

Example 1 – M5-50 force gauge:
The accuracy is ±0.1% of full scale (FS). Multiplying ±0.1% by the 50 lbF capacity gives ±0.05 lbF, so any displayed reading may differ from the true value by up to 0.05 lbF. For example, if the displayed value is 30.00 lbF, the true value will be ≥29.95 lbF and ≤30.05 lbF.
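
This calculation can be captured in a few lines of code. The following Python sketch is illustrative only; the function name error_band is our own, not part of any Mark-10 specification or software.

```python
# Minimal sketch: convert a %-of-full-scale accuracy into an error band
# around a displayed reading. Function name and values are illustrative.

def error_band(reading, capacity, accuracy_pct_fs):
    """Return (low, high) bounds for a reading, given accuracy in % FS."""
    fixed_error = capacity * accuracy_pct_fs / 100.0  # e.g. 50 lbF * 0.1% = 0.05 lbF
    return reading - fixed_error, reading + fixed_error

# M5-50 force gauge: +/-0.1% FS on a 50 lbF capacity
low, high = error_band(30.00, capacity=50.0, accuracy_pct_fs=0.1)
print(f"True value between {low:.2f} and {high:.2f} lbF")  # 29.95 and 30.05 lbF
```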

Example 2 – Plug & Test™ indicators and sensors:
The accuracies of the sensor and the indicator must be added together. Model 7i and 5i indicators are rated at ±0.1% FS, while the Model 3i is rated at ±0.2% FS. Series R50 torque sensors are rated at ±0.35% FS, so pairing an R50 sensor with a Model 3i indicator gives ±0.35% + ±0.2% = ±0.55% FS. For the Model MR50-12, which has a capacity of 135 Ncm, this equals ±0.55% × 135 Ncm = ±0.7425 Ncm.
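
The same arithmetic, with the two accuracies summed first, can be sketched as follows; the helper name combined_error is hypothetical:

```python
# Sketch: sum sensor and indicator %-FS accuracies, then express the
# result as a torque value. Names are illustrative, not a Mark-10 API.

def combined_error(capacity, sensor_pct_fs, indicator_pct_fs):
    total_pct = sensor_pct_fs + indicator_pct_fs  # 0.35% + 0.2% = 0.55%
    return total_pct, capacity * total_pct / 100.0

# MR50-12 sensor (135 Ncm capacity, +/-0.35% FS) with a Model 3i indicator (+/-0.2% FS)
pct, err = combined_error(135.0, sensor_pct_fs=0.35, indicator_pct_fs=0.2)
print(f"+/-{pct:.2f}% FS = +/-{err:.4f} Ncm")  # +/-0.55% FS = +/-0.7425 Ncm
```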

Percentage of Reading:
Because these errors are a fixed load value, lower measured values carry a larger error as a percentage of reading.

Continuing with the M5-50 force gauge example, a fixed error of ±0.05 lbF represents a larger error as a percentage of reading for a 1.00 lbF load than for a 30.00 lbF load.

To calculate the error as a percentage of reading, divide the fixed error by the measured value. For a 1.00 lbF load, the error equals ±0.05 lbF ÷ 1.00 lbF = ±5% of reading. For a 30.00 lbF load, the error equals ±0.05 lbF ÷ 30.00 lbF ≈ ±0.17% of reading.
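
A short sketch of this conversion, using the M5-50 figures from above (the helper name is our own):

```python
# Sketch: express a fixed error (in load units) as a percentage of reading.

def pct_of_reading(fixed_error, measured_value):
    return 100.0 * fixed_error / measured_value

print(f"{pct_of_reading(0.05, 1.00):.2f}% of reading")   # 5.00% at a 1.00 lbF load
print(f"{pct_of_reading(0.05, 30.00):.2f}% of reading")  # 0.17% at a 30.00 lbF load
```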

Because of this relationship between load and accuracy, we recommend selecting an instrument capacity as close as possible to the maximum measured load.
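
As an illustration of that advice, the sketch below picks the smallest capacity that still covers the maximum expected load; the capacity list is hypothetical, not an actual Mark-10 model range:

```python
# Sketch: choose the smallest available capacity >= the maximum measured load,
# which minimizes the fixed full-scale error relative to the readings.

def best_capacity(max_load, capacities):
    candidates = [c for c in capacities if c >= max_load]
    return min(candidates) if candidates else None

print(best_capacity(30.0, [10, 25, 50, 100, 200]))  # 50 (lbF, illustrative)
```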