Calibration of Timers and Stopwatches

Time interval, defined as the elapsed time between two events, can be measured with stopwatches and timers. These devices measure and display the time interval from a given starting point until they are stopped.

Laboratory timers and stopwatches are essential devices found in most laboratories, research settings, and a whole range of other industrial environments. They are used for monitoring strain rates, establishing drying intervals, determining flow speeds, and similar tasks. Timing is an important part of most industrial and laboratory testing, so the measurements taken with these timing devices must be accurate and verifiable.

Precision laboratory environments require that timing devices be calibrated at least every six months. Older stopwatches and timers had mechanical movements and displays; the most current electronic digital devices are highly reliable and accurate to 0.005%. Some models are disciplined by a remotely transmitted radio time signal and can provide even better accuracy. A device's time resolution is the smallest time interval it can measure and display.
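To put the 0.005% accuracy figure above in perspective, a short back-of-envelope sketch (the interval lengths are illustrative, not from any particular device specification):

```python
# An accuracy of 0.005% means up to 5e-5 seconds of error
# for every second of elapsed time.

ACCURACY = 0.005 / 100  # fractional accuracy, i.e. 5e-5

def max_error_seconds(elapsed_s: float, accuracy: float = ACCURACY) -> float:
    """Worst-case accumulated timing error over an elapsed interval."""
    return elapsed_s * accuracy

print(max_error_seconds(3600))   # over one hour: 0.18 s
print(max_error_seconds(86400))  # over one day:  4.32 s
```

This is why a short calibration run demands a reference far better than the device under test: the error being hunted is a small fraction of a second per hour.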

The standard unit of time is the second. Seconds can be divided into fractions such as milliseconds or microseconds, or grouped into larger units such as minutes, hours, and days.

The second is defined based on a property of the stable cesium atom. This definition is very accurate, and cesium oscillators are therefore accepted as primary standards for both time interval and frequency. Formally, a second is the duration of 9,192,631,770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the cesium-133 atom.
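The defined cesium frequency also tells you how finely a second is subdivided by the standard. A quick calculation:

```python
# The SI second corresponds to 9,192,631,770 cycles of the cesium-133
# hyperfine transition; this value is exact by definition.

CS_FREQUENCY_HZ = 9_192_631_770

period_s = 1 / CS_FREQUENCY_HZ
print(f"One cesium cycle lasts about {period_s:.4e} s")  # ~1.0878e-10 s
```

Each cycle lasts roughly a tenth of a nanosecond, which is why cesium-referenced signals can resolve time intervals far below anything a stopwatch display can show.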


NIST Time Signal Radio Station

Whatever method you use to calibrate a timing device, you should ensure that it is traceable to a national standard.

In the United States, the National Institute of Standards and Technology (NIST) is the best reference point for most measurements. NIST provides its own real-time representation of Coordinated Universal Time (UTC) that is available to the public through a variety of radio, telephone, and internet signals.


Calibration of a Timer or Stopwatch

You can buy a calibrated standard timer and use it as a reference instrument to calibrate the other timing devices in your laboratory. Calibration of all your timing devices can be done in-house or by an accredited outside agency such as e2b calibration.

All stopwatch and timer calibrations are comparisons between the Device Under Test (DUT) and a measurement reference, or standard. To calibrate a stopwatch or timer, either a time interval standard or a frequency standard is used as the reference. If you use a time interval standard, compare it to the device's display; if you use a frequency standard, compare it to the device's time base oscillator.

You should ensure that your reference standard is always more accurate than the device you want to calibrate. The following three methods are generally accepted for calibrating a timing device:


  1. The Direct Comparison Method

The direct comparison method is the most common method used to calibrate stopwatches and timers. It uses a traceable time interval as its reference; you can obtain audio time signals by telephone or radio. Since you are measuring a time interval rather than the current time, any delay in the signal from the source is unimportant as long as it remains constant during the calibration.

Calibration Procedure for the Direct Comparison Method

  • Tune in your radio to a traceable source of precise time at the top of the hour.
  • At the signal on the hour, start the stopwatch and note down the time.
  • After a suitable time period, listen to the time signal again, and stop the stopwatch on hearing the tone, and note the stopping time.
  • Calculate the difference between the start time and the stop time to get the time interval, and compare this time interval to what is displayed by the stopwatch.
  • Check if the two time intervals are within the specifications of the device.
  • If not, adjust the device and repeat the above steps.
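The bookkeeping in the steps above can be sketched as follows (the signal times and displayed value are illustrative):

```python
# A minimal sketch of the direct comparison arithmetic, assuming you
# recorded the traceable on-the-hour start/stop signal times and the
# interval shown on the stopwatch display.

from datetime import datetime

def direct_comparison_offset(start: str, stop: str, displayed_s: float) -> float:
    """Difference (s) between the stopwatch display and the reference interval.

    start/stop are the reference signal times as "HH:MM:SS" strings.
    """
    fmt = "%H:%M:%S"
    reference_s = (datetime.strptime(stop, fmt)
                   - datetime.strptime(start, fmt)).total_seconds()
    return displayed_s - reference_s

# Stopwatch showed 3600.25 s over a one-hour reference interval:
offset = direct_comparison_offset("10:00:00", "11:00:00", 3600.25)
print(offset)  # 0.25 s fast; compare against the device's specification
```

Note that the operator's reaction time at start and stop dominates the uncertainty of this method, which motivates the Totalize method described next.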


  2. The Totalize Method

In the Totalize method, the time interval reference is generated using a synthesized signal generator, a traceable frequency standard, and a universal counter. This method partially removes the measurement uncertainty caused by the operator's reaction time.

Calibration Procedure for the Totalize Method

  • Set the counter to Totalize.
  • Take a signal from a standard synthesized signal generator and connect it to the input of the device. Use the laboratory’s primary frequency standard as the external time base for the synthesizer and the counter.
  • Check that the period of the frequency is at least one order of magnitude smaller than the resolution of the device under test.
  • Start the stopwatch and manually open the gate of the counter at the same time.
  • After a suitable period of time, simultaneously stop the stopwatch and close the gate of the counter.
  • Compare the two readings. Calculate the fractional offset as ∆t/T, where ∆t is the difference between the counter and stopwatch displays, and T is the length of the measurement run.
  • If the result is not within an acceptable band adjust the device and repeat the above steps.
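The ∆t/T calculation from the steps above can be sketched as follows (the generator frequency, count, and stopwatch reading are illustrative values):

```python
# A sketch of the Totalize-method arithmetic. counter_counts is the
# totalized count accumulated while the gate was open, gen_freq_hz the
# synthesizer frequency, and stopwatch_s the device under test's display.

def totalize_offset(counter_counts: int, gen_freq_hz: float,
                    stopwatch_s: float) -> float:
    """Fractional offset delta_t / T of the stopwatch vs. the reference."""
    reference_s = counter_counts / gen_freq_hz   # T, from the counter
    delta_t = stopwatch_s - reference_s          # difference of the displays
    return delta_t / reference_s

# 1 kHz reference, 3,600,050 counts, stopwatch showing 3600.00 s:
print(totalize_offset(3_600_050, 1000.0, 3600.00))
```

A negative result means the stopwatch ran slow relative to the reference over the run.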


  3. The Time Base Method

The time base method is based on frequency measurement: it compares the frequency of the device's time base oscillator to a calibrated and traceable frequency standard. It is the preferred method for stopwatch and timer calibrations since it offers the lowest measurement uncertainty. Because the device's time base is measured directly, the technician's response time is not a significant factor.


Calibration Procedure for the Time Base Method

For calibrating a stopwatch time base, you may use a commercially available measurement system or you may use a frequency counter with an acoustic pickup.

The reference for a time base calibration is the time base oscillator of the measuring instrument. The frequency standard used must be calibrated and certified to ensure traceability. Alternatively, you may maintain a traceable 5 or 10 MHz signal in the laboratory and use it as an external time base for the frequency counter and other test equipment. It is not necessary to calibrate the counter's internal time base oscillator if the measurement uncertainty of the external time base is known.


Using a Commercial Time Base Measurement System

Commercially available time base measurement systems may be used. Such a device measures the frequency of the time base oscillator and displays the result in seconds per day or seconds per month. The same function can be performed with an acoustic or inductive pickup and a frequency counter.


Using a Frequency Counter and an Acoustic Pickup

With a frequency counter, you can directly measure the frequency of a stopwatch's time base. You can calculate the frequency offset from the reading on the counter display. Adjust the device until the offset is eliminated.
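The offset calculation can be sketched as below. The 32,768 Hz nominal frequency is typical of quartz stopwatch time bases but is an assumption here; use the nominal value from the device's documentation.

```python
# A sketch of the time base method arithmetic: compare the counter's
# measured oscillator frequency against the nominal time base frequency.

NOMINAL_HZ = 32_768.0  # typical quartz time base; device-specific assumption

def timebase_offset(measured_hz: float, nominal_hz: float = NOMINAL_HZ):
    """Return (fractional frequency offset, equivalent seconds per day)."""
    fractional = (measured_hz - nominal_hz) / nominal_hz
    return fractional, fractional * 86_400  # scale to one day

frac, s_per_day = timebase_offset(32_768.4)  # counter read 0.4 Hz high
print(f"offset {frac:.2e}, about {s_per_day:.2f} s/day fast")
```

Expressing the offset in seconds per day matches the display convention of the commercial measurement systems mentioned above and makes the result easy to compare against the device specification.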

e2b calibration provides ISO/IEC 17025:2017 accredited timer and stopwatch calibration with NIST-traceable master testing standards. Our verifiable and traceable services are unmatched in the industry. These calibration procedures let our expert technicians ensure that your devices are calibrated to the highest industry standards. e2b calibration can also provide on-site calibration services. Please contact us for the calibration of your timers and stopwatches.



© Copyright 2020 e2b calibration