Choosing a Food Thermometer
Food thermometers are used to test potentially hazardous foods during storage, cooking and transport in order to minimise the growth of bacteria. Temperature control is the easiest and most effective way to limit bacterial growth and the formation of toxins. Any foods containing raw or cooked meat, dairy products, seafood, cooked rice or pasta, or processed fruit and vegetables are considered potentially hazardous.
The primary thermometry concepts to take into account when choosing a food thermometer are accuracy, resolution, and reproducibility. Secondary factors to consider include range, speed, durability and the environment in which the unit will be used. It is then important to determine the appropriate type of thermometer required, as different types are made to measure different physical characteristics.
Any slight increase or decrease in temperature can have a profound effect upon the growth of bacteria. Electronic thermometers with digital displays make it easy to measure temperature within a tenth of degree or less. There are valuable features in today’s thermometers that allow the user to view, record and manipulate the measurements taken.
A food thermometer is a crucial instrument within any food safety program. Food thermometers come in all shapes and sizes with various usage procedures and additional features. It is important to evaluate each situation to establish which type of thermometer would be most beneficial based on the desired outcome and results.
Primary Thermometry Concepts
Reproducibility, accuracy and resolution are the foundations upon which all good thermometer technology is built. Each component is equally integral to the overall quality and efficiency of the unit. For example, a thermometer may be accurate with a high resolution, but without the assurance of reproducibility the readings are unreliable.
Employing a universal scale (whether Fahrenheit, Celsius, Kelvin, Rankine or other more obscure scales) makes the establishment of scientific standards achievable. It allows the direct comparison of relative temperature data from place to place and instrument to instrument. For example, a thermometer measuring the ice point of water should read 0°C consistently, otherwise it would make any universal scale adopted meaningless in comparing the relative temperatures of dissimilar materials and environments.
The condition known as hysteresis is a common challenge to the reproducibility of thermometers. With hysteresis, the physical properties of an instrument are temporarily changed by the process of taking a measurement. It becomes apparent when measuring the same material (e.g. an ice bath) produces varying results. Although more common in bi-metal thermometers, electronic thermometers can also be affected. An extended resting period to allow the physical properties of the instrument to return to normal can sometimes restore accuracy, but often this is only a temporary solution. In accordance with Australian Standards and general best practice, it is important that your thermometers are calibrated regularly.
Calibration provides certainty in knowing the true accuracy and measurement traceability of your instrumentation. At Ross Brown Sales, our in-house Temperature Measurement Laboratory can provide a Workshop Calibration Certificate that is carried out using NATA certified reference equipment.
“Drift” is the tendency of instruments to lose accuracy over time. Although unavoidable regardless of thermometer type, it is easily counteracted through regular re-calibration. For electronic thermometers, general best practice suggests calibrations are carried out every 12 months. As technology advances, we depend on more accurate and precise temperature measurements. Electronic thermometers with their own computer circuitry can achieve better results by factoring in such things as the effect of ambient temperature. However, separating the temperature sensor (probe) from the temperature calculator (meter) increases the possibility of error, and both components need to be regularly calibrated.
Thermometer resolution refers to the smallest increment of measurement on an instrument. A thermometer with a tenth of a degree resolution reads to the nearest 0.1°C (eg 46.6°C), whereas an instrument with a hundredth of a degree resolution displays finer increments (eg 46.66°C).
Although resolution and accuracy are two separate elements, the two should be thought of as going hand in hand. A thermometer with ±0.05°C accuracy would be wasted if the resolution were only tenths of a degree. Likewise, it could be misleading for a thermometer with hundredths of a degree resolution to offer only ±1°C accuracy.
Some thermometers have an “auto-ranging” feature where the resolution adjusts when measuring above or below a certain temperature. For example, the ETI ThermaQ® Thermometer has a temperature range of -99.9°C to 1372°C, and its resolution is 0.1°C up to 299°C and 1°C thereafter (300°C to 1372°C).
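The auto-ranging behaviour can be sketched in a few lines of Python. The 300°C threshold follows the ThermaQ example above; the function name and logic are purely illustrative, not the meter's actual firmware:

```python
def display_reading(temp_c: float) -> str:
    """Format a reading the way an auto-ranging display might:
    0.1 degree resolution below 300 degC, whole degrees above.
    (Threshold taken from the ThermaQ example; logic is illustrative.)
    """
    if temp_c < 300:
        return f"{temp_c:.1f}\u00b0C"  # tenth-of-a-degree resolution
    return f"{temp_c:.0f}\u00b0C"      # whole-degree resolution

print(display_reading(46.66))  # -> 46.7°C
print(display_reading(850.4))  # -> 850°C
```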
Secondary Thermometry Concepts
Building upon the foundation of thermometer technology, range, speed and type are important factors to consider when choosing a thermometer. Different thermometer technologies are more effective in certain situations based on what is being tested. For example, when testing the “doneness” of meat, an infrared thermometer will only give you a reading of surface temperature, so a probe thermometer would be more applicable. Depending on the type of meat, a more accurate thermometer may be needed; this may require an instrument with a smaller temperature range in order to obtain a more precise reading.
Range describes the upper and lower limits of a thermometer’s measurement scale. Some thermometers have a broad temperature range, whereas others specialise within a particular environment (ie fridge/freezer) and will provide a more economical solution. Often a thermometer will have different accuracy and/or resolution specifications within its temperature range. For example, the new Thermapen Blue has an accuracy of ±0.4°C between -49.9°C and 199.9°C, and ±1°C thereafter. It is essential to read specification tables carefully, especially in cases where the probe and meter are separate.
Speed (aka Response Time) is an influential aspect when choosing a thermometer. Response time is affected by many factors, such as: the sensor’s position relative to the substance being measured, the mass of the sensor itself, the speed of the processor, the length of the wiring between the sensor and the processor, and the type of technology used.
Some thermometer technologies are faster than others. Generally, electronic thermometers are faster than mechanical thermometers (ie liquid mercury or dial thermometers), and thermocouple sensors are faster than resistance technologies (ie thermistor or RTD). Additionally, the closer sensor position and smaller mass of reduced-tip probes facilitate faster responsiveness than standard-diameter probes.
In technical catalogues and websites, response time is often listed in increments called “time constants”. One time constant is the time it takes for a given instrument to reach 63% of the full reading. In order to obtain an accurate 100% practical equivalent, four more time constants are required (five time constants in total).
It is beneficial to determine whether an advertised thermometer speed is a technical response time or a full-reading claim. For example, the ETI SuperFast Thermapen has a technical response time of 0.6 seconds, producing a full-reading response time of 3 seconds. Technical response times can be misleading: a proclaimed response time of 3 seconds, if it is a one-time-constant figure, in fact corresponds to a 15-second full reading.
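The relationship between time constants and the fraction of a full reading follows from the standard first-order (exponential) sensor model. A short Python sketch, assuming the sensor behaves as a simple first-order system:

```python
import math

def fraction_of_step(time_constants: float) -> float:
    """Fraction of a temperature step that a first-order sensor has
    registered after the given number of time constants."""
    return 1 - math.exp(-time_constants)

print(round(fraction_of_step(1), 3))  # ~0.632 -> the "63%" technical response
print(round(fraction_of_step(5), 3))  # ~0.993 -> practically a full reading
```

This is why five time constants are quoted as the practical full-reading figure: each additional time constant closes 63% of the remaining gap, so by the fifth the sensor is within about 1% of the final value.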
Reading Update Rate
The Reading Update Rate refers only to the frequency with which the digital processor of a thermometer samples the sensor. For example, the ETI SuperFast Thermapen has an update rate of 0.5 seconds which means the digital display will show changes in the temperature as measured by the sensor every half second. This number can be misleading as it has nothing to do with the speed with which the sensor will adjust to the temperature of the material being measured.
It is important to note that the real response time of a thermometer can vary depending on the particular substance and/or range of temperatures being measured. Specification tables give outside limits, not exact speeds.
The total response time may also be the sum of each individual component, ie, the meter response time plus the probe response time. Integrated systems like the SuperFast Thermapen and Food Check are often favoured because their listed response times are composite.
The five most common thermometer technologies are:
- Liquid expansion devices
- Bi-metallic devices
- Thermocouple devices
- Resistance temperature devices (RTDs) and thermistors
- Infrared radiation devices
Bi-metals are mechanical thermometers that have a dial display. The dial is connected to a spring coil at the centre of the probe. The spring is made of two different types of metal that expand in different (but predictable) ways when exposed to heat. Heat expands the spring and pushes the needle on the dial. Despite being cheap, bi-metal thermometers generally take minutes to reach full temperature and require the entire metal coil to be immersed in the material being measured to get an accurate reading (usually more than an inch or two). They lose calibration very easily and need to be re-calibrated weekly or even daily using a simple screw that rewinds the metal coil.
Resistance temperature devices (RTDs and thermistors) measure the effect of heat on electric current. They take advantage of the fact that electrical resistance changes with temperature along predictable curves. With thermistors, resistance decreases as temperature rises, whereas with RTDs resistance increases.
RTDs, which commonly use platinum wire or metal films, are very accurate and highly repeatable. Thermistor elements are highly sensitive and commonly use ceramic beads as resistors. Thermistors are inexpensive and reliable but are not built for high temperatures.
Thermocouples work on the principle that when two different metals are joined in a circuit and the junctions are held at different temperatures, a small, predictable voltage is generated. The Hot Zone is the junction where the two metals are welded together at the tip of the thermometer probe. Common metal pairings include nickel and chromium (Type K), copper and constantan (Type T) or iron and constantan (Type J). The thermocouple’s Cold Zone (reference point), where the two metal wires are open, is handled either by a cold junction (part of the circuit is brought to the ice point [0°C/32°F]) or by electronic cold junction compensation.
Thermocouples can detect temperatures across wide ranges and are typically very fast. They can be an all-in-one choice because they use interchangeable probes for different applications.
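The voltage-to-temperature relationship can be sketched with a simple linear approximation. A real meter uses standard polynomial reference tables; the ~41 µV/°C figure is only the approximate Type K sensitivity near room temperature, and the function names here are illustrative:

```python
SEEBECK_UV_PER_C = 41.0  # approximate Type K sensitivity near room temperature

def thermocouple_voltage_uv(hot_c: float, cold_c: float) -> float:
    """Linear approximation of the voltage a Type K thermocouple
    generates when its junctions sit at different temperatures."""
    return SEEBECK_UV_PER_C * (hot_c - cold_c)

def compensated_reading_c(measured_uv: float, cold_junction_c: float) -> float:
    """Electronic cold-junction compensation: add the measured
    cold-junction temperature back to the voltage-derived difference."""
    return measured_uv / SEEBECK_UV_PER_C + cold_junction_c

v = thermocouple_voltage_uv(hot_c=75.0, cold_c=22.0)  # 53 degC difference
print(v)                               # 2173.0 microvolts
print(compensated_reading_c(v, 22.0))  # 75.0 -> recovered hot-end temperature
```

This is why the Cold Zone matters: the thermocouple only reports a temperature *difference*, so the meter must know the cold-junction temperature to recover the absolute reading.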
- Max/Min feature is particularly useful when determining if a target has been kept within the designated temperature boundaries over an extended period of time. The Max/Min functionality displays the highest and lowest temperatures encountered and allows the user to record and monitor the results. Please note: electronic instruments with this feature generally omit auto-off, as powering down would reset the Max/Min recordings.
- Hold allows you to freeze a displayed measurement (usually a digital reading) for later consultation.
- Differential Recordings (Diff) displays the range of deviation over a span of time by subtracting the minimum from the maximum recorded temperatures.
- Average temperature recordings (Avg) simply displays the calculated average temperature from all the individual measurements ascertained over a span of time.
- Hi/Lo feature enables you to predetermine a “safe” temperature range. The alarm is triggered when a measurement has gone above or below this preset temperature range. The alert can be in the form of a blinking light, beeping sound, email or text message.
- Auto-off is a feature designed to protect long-term battery life by automatically shutting the unit off after a set amount of time (some products will facilitate the option to disable this feature).
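Taken together, the Max/Min, Diff and Avg features amount to simple arithmetic over a series of logged readings. A minimal Python sketch (the function name and sample data are hypothetical, not from any particular instrument):

```python
def summarise_log(readings: list[float]) -> dict[str, float]:
    """Compute the values the Max/Min, Diff and Avg features report
    from a series of logged temperature readings."""
    return {
        "max": max(readings),
        "min": min(readings),
        "diff": max(readings) - min(readings),  # deviation over the period
        "avg": sum(readings) / len(readings),
    }

log = [3.0, 4.5, 2.5, 5.0, 4.0]  # hypothetical fridge readings in degC
print(summarise_log(log))
# {'max': 5.0, 'min': 2.5, 'diff': 2.5, 'avg': 3.8}
```

A Hi/Lo alarm is then just a check of each new reading against the preset boundaries as it arrives.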