Stopwatch and Timer Calibrations - National Institute of Standards ...
Section 8
How to Determine if the Calibration Method Meets the Required Uncertainty
The preceding sections have provided estimates of the expanded measurement uncertainty associated with various types of stopwatch calibrations. It is important to ensure that the achievable measurement uncertainty is small compared to the stopwatch tolerance we are attempting to verify. For example, if we use the direct comparison method to calibrate a stopwatch with an accuracy of 5 s per day (0.01 s resolution), and we decide to use a calibration period of 5 minutes, the accuracy requirement for the stopwatch over the 5 minute interval would be about 17 ms. The first problem is that the stopwatch has a resolution of 0.01 s and cannot display 0.017 s, making the specification impossible to verify over this time period. Furthermore, the measurement uncertainty of the calibration process is 480 ms, approximately 28 times larger than the accuracy that we are trying to verify! Clearly, a 5 minute calibration period is not adequate for this stopwatch and calibration method. Even a 1 hour calibration is inadequate, because the quantity to be measured, now about 208 ms, is still less than half of the measurement uncertainty.
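The arithmetic above can be sketched as a short calculation. The values (5 s per day tolerance, 0.01 s resolution, 480 ms uncertainty for the direct comparison method) come from the text; the function name is illustrative.

```python
# Scale a stopwatch's daily accuracy specification down to a shorter
# calibration interval, then compare it against the display resolution
# and the measurement uncertainty of the calibration method.

SECONDS_PER_DAY = 86400

def tolerance_over_interval(daily_tolerance_s, interval_s):
    """Tolerance the stopwatch must hold over a given calibration interval."""
    return daily_tolerance_s * interval_s / SECONDS_PER_DAY

daily_tolerance = 5.0   # s per day (manufacturer's specification)
resolution = 0.01       # s (stopwatch display resolution)
uncertainty = 0.480     # s (direct comparison method, from this section)

for label, interval_s in [("5 minutes", 5 * 60), ("1 hour", 3600)]:
    tol = tolerance_over_interval(daily_tolerance, interval_s)
    print(f"{label}: tolerance = {tol * 1000:.0f} ms, "
          f"displayable = {tol >= resolution}, "
          f"uncertainty is {uncertainty / tol:.1f}x the tolerance")
```

Running this reproduces the figures in the text: roughly 17 ms over 5 minutes (about 28 times smaller than the 480 ms uncertainty) and roughly 208 ms over 1 hour (still less than half the uncertainty).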
As stated in ISO/IEC 17025 [4], when a statement of compliance with a metrological specification is made (for example, whether the DUT is "in tolerance" or "out of tolerance" based on the manufacturer's specifications), the laboratory must take the measurement uncertainty into account. It is the responsibility of the calibration laboratory to decide whether the measurement uncertainty associated with the calibration method is small enough to comply with the metrological requirement. Calibration laboratories generally use some sort of decision rule to determine whether the uncertainty associated with the calibration method is adequate. Examples of such rules are the N:1 decision rule (where the tolerance of the instrument under test must be N times larger than the calibration uncertainty, most commonly stated as 4:1 or 3:1), or some form of guardbanding (where the calibration uncertainty, or some fraction of it, is subtracted from the DUT accuracy specification, and the calibration results are acceptable only if the measured offset falls within the guardband limit). Whatever the case, it is important to compare the calibration measurement uncertainty to the tolerance of the DUT, in order to ensure that the calibration process is valid.
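The two decision rules described above can be sketched as follows. This is a minimal illustration, not a normative implementation; the function names and the choice of subtracting the full uncertainty as the guardband are assumptions for the example.

```python
# Two common decision rules for judging whether a calibration result
# supports a statement of compliance.

def meets_tur(tolerance_s, uncertainty_s, n=4):
    """N:1 rule: the DUT tolerance must be at least N times the
    calibration uncertainty (4:1 shown as the default)."""
    return tolerance_s >= n * uncertainty_s

def within_guardband(measured_offset_s, tolerance_s, uncertainty_s):
    """Guardband rule: accept only if the measured offset falls inside
    the tolerance reduced by the calibration uncertainty."""
    guardband_limit = tolerance_s - uncertainty_s
    return guardband_limit > 0 and abs(measured_offset_s) <= guardband_limit

# Example with the numbers from this section: over a full day the
# 5 s/day stopwatch tolerance is 5 s, against a 0.48 s uncertainty.
print(meets_tur(5.0, 0.48))             # 5 s vs. 0.48 s exceeds 4:1
print(within_guardband(4.0, 5.0, 0.48)) # 4.0 s offset is inside 4.52 s limit
```

Note that over the 1 hour interval discussed earlier (tolerance about 0.208 s, uncertainty 0.48 s), both rules reject the calibration: the ratio is below 1:1 and the guardband limit is negative.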