Temperature Loggers: Accuracy versus Resolution

Resolution

Resolution indicates how finely a reading is reported. It is measured in degrees, and the smaller the value, the finer the reading. For example, at a resolution of 0.5°, a displayed reading of 3.5° could correspond to any actual temperature between 3.25° and 3.75°. At a resolution of 0.1°, the same reading of 3.5° narrows to between 3.45° and 3.55°.
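
This relationship is easy to check numerically. The short Python sketch below is illustrative only (the function name is our own, not any logger's API): it converts a displayed reading and a resolution into the range of actual temperatures that reading could represent, assuming the device rounds to the nearest resolution step.

def reading_range(displayed: float, resolution: float) -> tuple[float, float]:
    """Range of actual temperatures a displayed reading could represent,
    assuming the device rounds to the nearest resolution step."""
    half_step = resolution / 2
    return (displayed - half_step, displayed + half_step)

# A 3.5° reading at 0.5° resolution spans 3.25° to 3.75°;
# at 0.1° resolution it narrows to 3.45° to 3.55°.
print(reading_range(3.5, 0.5))  # (3.25, 3.75)
print(reading_range(3.5, 0.1))  # (3.45, 3.55)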

The resolution required depends upon the application. When monitoring body temperature, for example, a resolution of 0.1° or finer is essential. For food, this degree of resolution is often unnecessary: the temperature changes being tracked span a couple of degrees, and a resolution of 0.5° is often acceptable.


Accuracy

Accuracy indicates how close the displayed reading is to the actual temperature. The smaller the number, the better the device. The current HACCP requirement for accuracy in the food industry is typically ±1°C.
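
In practice the two specifications combine. The hedged sketch below, assuming the stated accuracy is a symmetric ± band and adding half a resolution step for display rounding, gives the worst-case band an actual temperature could fall in (the function and figures are illustrative, not a standard):

def worst_case_band(displayed: float, accuracy: float, resolution: float) -> tuple[float, float]:
    """Worst-case range of actual temperatures behind a displayed reading:
    the stated ± accuracy plus half a resolution step for display rounding."""
    margin = accuracy + resolution / 2
    return (displayed - margin, displayed + margin)

# A fridge logger showing 4.0°C with ±1°C accuracy and 0.5° resolution
# could be reading a fridge that is anywhere between 2.75°C and 5.25°C.
print(worst_case_band(4.0, 1.0, 0.5))  # (2.75, 5.25)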


Accuracy vs Resolution

The difference between accuracy and resolution can be seen in this simple analogy.

Two people each have a watch. One watch shows hours, minutes and seconds; it has a resolution of 1 second. The other shows only hours and minutes; it has a resolution of 1 minute.

The first person sets their watch once a month, so it can be up to 5 minutes fast or slow: an accuracy of ±5 minutes. The second person sets their watch daily, so it is accurate to within ±1 minute.

In this example, the first person could state the time as “1:34 PM and 23 seconds” but could be up to 5 minutes out. Their answer is specific (good resolution) but inaccurate. The second person can only say the time is “1:36 PM” but will be within one minute of the correct time. Their answer has a lower resolution but is more accurate.

Many thermometers on the market display the temperature to one or two decimal places yet have an accuracy worse than ±1°. Do not mistake decimal places for accuracy. If the accuracy is not stated on the product, assume it is worse than ±1°.
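
The same point can be made with a small simulation. The sketch below is illustrative only: both sensors and their error figures are hypothetical. Sensor A displays two decimal places but has poor accuracy; sensor B displays whole degrees but stays close to the true temperature.

import random

random.seed(1)

def simulate_reading(true_temp: float, accuracy: float, resolution: float) -> float:
    """Apply a random error within ±accuracy, then quantise the result to
    the display's resolution step — what the device would actually show."""
    measured = true_temp + random.uniform(-accuracy, accuracy)
    return round(measured / resolution) * resolution

true_temp = 4.0
# Sensor A: 0.01° resolution (two decimal places) but only ±2° accuracy.
print(f"Sensor A: {simulate_reading(true_temp, 2.0, 0.01):.2f}°")
# Sensor B: 1° resolution (whole degrees) but ±0.5° accuracy.
print(f"Sensor B: {simulate_reading(true_temp, 0.5, 1.0):.0f}°")

Sensor A's extra decimal places suggest a precision it does not have; sensor B's coarse display is the more trustworthy of the two.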