To the uninitiated, the terms accuracy, repeatability and precision might appear to be interchangeable. However, in the world of instrumentation there is a distinct difference between them, so it is important to understand what each one means and how it relates to the others.
Let’s start by looking at their definitions.
Accuracy
The accuracy of an instrument is determined by the difference between a measured value and the actual (true) value. Since no measurement is 100% exact, an element of inaccuracy always needs to be considered, which is why accuracy figures are quoted with ‘±’. Ultimately, accuracy measures how close you come to the correct result, and it improves when your instruments or tools are properly calibrated.
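As a quick illustration, here is a minimal Python sketch of that idea. The reference value, the reading and the ±0.5 tolerance are all hypothetical numbers chosen for the example, not figures from any real instrument:

```python
# Hypothetical known reference (true) value and a single instrument reading.
true_value = 100.0
measured_value = 99.6

# Accuracy is judged by the difference between measured and true values.
error = measured_value - true_value
within_spec = abs(error) <= 0.5  # example tolerance, quoted as ±0.5

print(f"Error: {error:+.2f}, within ±0.5 spec: {within_spec}")
```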
Precision
Precision is the closeness of agreement between repeated measurements of the same quantity, judged relative to each other rather than to the actual value. Precision improves when you use finely incremented tools that require less estimation; better equipment and improved procedures equal better precision.
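One common way to put a number on precision is the spread (standard deviation) of a set of repeated readings. The sketch below uses made-up readings to show that precision says nothing about closeness to the true value, only about the readings' agreement with each other:

```python
import statistics

# Hypothetical repeated readings of the same quantity.
readings = [99.6, 99.7, 99.5, 99.6, 99.8]

mean = statistics.mean(readings)
spread = statistics.stdev(readings)  # sample standard deviation

# A small spread means high precision, even if every reading is far
# from the true value (i.e. precise but not accurate).
print(f"Mean: {mean:.2f}, spread: {spread:.3f}")
```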
Repeatability
Repeatability measures how close a particular result or set of data comes to the same measurement, made with the same device or instrument, under exactly the same circumstances. In other words, the measurement procedure, observer, device or instrument, testing conditions and location all need to be identical, and the testing needs to be carried out over a short space of time.
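Numerically, repeatability is assessed the same way as precision; what distinguishes it is the constraint that the conditions stay fixed. The sketch below (again with hypothetical readings) treats one short measurement session as a repeatability test:

```python
import statistics

# Hypothetical readings: same instrument, same observer, same location,
# same conditions, taken within a few minutes of each other.
session = [99.6, 99.7, 99.6, 99.5, 99.6]

# Because the conditions were held constant, the spread of this session
# can be read as the instrument's repeatability.
repeatability = statistics.stdev(session)
print(f"Repeatability (spread under identical conditions): {repeatability:.3f}")
```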
Illustrating the difference
Getting to grips with these definitions and how they relate to each other can be quite complicated. To make things easier to understand, consider the simple diagrams below.