Ohm’s Law and Calibration

Ohm’s Law is a fundamental formula that defines the relationship between voltage (E), current (I), and resistance (R) in an electronic circuit. The Ohm’s Law formula is typically expressed as E = I x R, or voltage = current x resistance. In an electronic circuit, Ohm’s Law can be stated as follows: the voltage drop across a resistor is directly proportional to both the current flowing through the resistor and the resistance of the resistor, while the current through the resistor is directly proportional to the voltage across it and inversely proportional to its resistance.

An easy way to remember the formula and the calculations for any of the variables is the Ohm’s Law triangle: E sits at the top of the triangle, with I and R side by side along the bottom.

If any two of the values are known, then the third value can be calculated. For example, if the voltage (E) and current (I) are known and the resistance (R) needs to be calculated, cover the R in the triangle and the remaining symbols give the equation to use, in this case, R = E / I.
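The triangle method can be sketched as a small helper function; `ohms_law` is a hypothetical name used here for illustration:

```python
# Illustrative helper for the Ohm's Law triangle: given any two of
# voltage (E), current (I), and resistance (R), compute the third.
def ohms_law(E=None, I=None, R=None):
    if E is None:
        return I * R      # E = I x R
    if I is None:
        return E / R      # I = E / R
    if R is None:
        return E / I      # R = E / I
    raise ValueError("Provide exactly two of E, I, R")

# Example: 12 volts across a circuit drawing 2 amps -> 6 ohms
print(ohms_law(E=12, I=2))  # 6.0
```

Covering one corner of the triangle corresponds to leaving that argument out of the call.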

Ohm’s Law in the Calibration Laboratory

Calibration Laboratories use Ohm’s Law in a number of specific measurement functions, usually for measurements that fall outside the ranges of the laboratory’s calibration standards, such as very high or very low resistances, or high current values.

Milli-Ohmmeters are instruments that measure low resistances, typically from 1000 ohms down to ranges as low as 5 milliohms. On many standard Digital Multimeters used in a Calibration Laboratory, the resistance ranges only go down to 10 ohms or 100 ohms, so they lack the accuracy and resolution to measure values significantly below 1 ohm.

Ohm’s Law is used to characterize standard resistors to provide an accurate resistance value to compare with the Milli-Ohmmeter reading. A known current is passed through the resistor and the resulting voltage drop is accurately measured to determine the resistance.
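The characterization step above is a direct application of R = E / I. A minimal sketch, with illustrative (not measured) values:

```python
# Sketch of characterizing a standard resistor: pass a known current
# through it, measure the voltage drop, and apply Ohm's Law.
# The numeric values below are illustrative assumptions only.
known_current = 0.100          # 100 mA sourced through the resistor, in amps
measured_voltage = 0.0100053   # voltage drop read on the voltmeter, in volts

resistance = measured_voltage / known_current  # R = E / I
print(f"Characterized value: {resistance * 1000:.4f} milliohms")
```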

Most standard resistors used for calibrating Milli-Ohmmeters are only available with low power ratings, usually between 1 and 5 watts, so the test current should be kept as low as practical so that the resistance value does not drift from self-heating of the resistor.
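The self-heating concern can be quantified with the power formula P = I²R, which follows from Ohm’s Law. A quick sketch, using illustrative values, shows why the test current matters:

```python
# Power dissipated in a resistor is P = I^2 * R (from P = E x I and
# E = I x R). Keeping dissipation well below the resistor's power
# rating limits drift from self-heating. Values are illustrative.
def dissipation_watts(current_a, resistance_ohm):
    return current_a ** 2 * resistance_ohm

# A 1 ohm standard resistor at 100 mA dissipates only 10 milliwatts,
# while 1 amp through the same resistor dissipates a full watt,
# which is significant for a 1 watt rated resistor.
print(dissipation_watts(0.1, 1.0))
print(dissipation_watts(1.0, 1.0))
```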

If the current output from the Milli-Ohmmeter being tested is known, then the characterization should be completed at that value. If not, 100 milliamps works well for most resistance values below 1 ohm; however, 1 amp may be needed for values under 10 milliohms.

In these low resistance ranges, the lead resistance of the resistors becomes a significant part of the measurement, so it is important that the characterization and the measurement be performed without removing the leads from the resistor.

Most Digital Multimeters on the market today can only measure up to 10 amps of current. When higher currents need to be measured, calibration laboratories use Current Shunts to measure the larger currents generated by power supplies or other power-generating equipment.

Current Shunts work in two steps. First, the resistance value of the shunt is determined by characterizing it with a known current. Then the unknown current is calculated by measuring the voltage across the shunt and dividing by the shunt resistance.
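Both steps are Ohm’s Law rearrangements, and can be sketched as follows (all numeric values are illustrative assumptions, not real readings):

```python
# Step 1: characterize the shunt with a known current (R = E / I).
known_current = 10.0           # amps, from a calibrated current source
voltage_at_known = 0.010012    # measured volts across the shunt
shunt_resistance = voltage_at_known / known_current   # ~1 milliohm

# Step 2: measure the unknown current by reading the shunt voltage
# and dividing by the characterized resistance (I = E / R).
measured_voltage = 0.0755      # volts across the shunt under the test load
unknown_current = measured_voltage / shunt_resistance
print(f"Unknown current: {unknown_current:.2f} A")
```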

The measurement and characterization of Current Shunts are very similar to the calibration of standard resistors described above. A typical 100 A / 100 mV Current Shunt has a resistance of roughly 1 milliohm, so the same basic concepts apply in determining the shunt value.

The main difference is that Current Shunts are designed to handle a large amount of current, so when characterizing them, a larger current should be used than is used for the standard resistors. Typically, a known 10 Amp or 20 Amp current is used depending on the range of the shunt and the range of the device being tested.

Usually, the best instruments in a calibration laboratory for generating accurate current values are larger Multifunction Calibrators that weigh 30+ pounds and can cost over $50,000. Needless to say, these units are not intended to be portable and rarely leave the temperature-controlled environment of the calibration laboratory.

When performing calibrations at a customer’s site, Ohm’s Law can be used to generate many current values when calibrating ammeters or other current measurement devices. By using a Power Supply, a Digital Multimeter to accurately measure the Power Supply output, and a calibrated range of resistors, such as in a decade resistor box, an accurate current value can be generated and used for calibrations.
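The on-site setup described above again reduces to I = E / R. A minimal sketch, with illustrative values for the supply voltage and decade-box setting:

```python
# Sketch of generating a known current on-site: a Power Supply drives a
# calibrated decade resistor box, a DMM measures the supply output, and
# Ohm's Law gives the current applied to the ammeter under test.
# The numeric values are illustrative assumptions.
dmm_voltage = 10.002        # supply output as measured by the DMM, volts
decade_resistance = 100.0   # calibrated decade-box setting, ohms

generated_current = dmm_voltage / decade_resistance  # I = E / R
print(f"Current applied to the device under test: {generated_current * 1000:.3f} mA")
```

Stepping the decade box through its calibrated settings produces a range of known currents from a single supply voltage.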

During troubleshooting, if the electrical measurements are not as expected, a technician can use Ohm’s Law to determine which component may be causing the problem. For example, if the voltage at a test point is too low, then either the current through or the resistance of that part of the circuit has decreased, which helps the technician isolate the faulty component.
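That troubleshooting inference can be sketched as a comparison between the expected drop (E = I x R from the schematic values) and the measured reading; the component values below are hypothetical:

```python
# Illustrative troubleshooting check: compare the expected voltage drop
# against the measured one, then infer what the branch resistance would
# have to be to produce the low reading. Values are hypothetical.
expected_current = 0.050      # amps, per the circuit design
nominal_resistance = 220.0    # ohms, marked value of the resistor

expected_drop = expected_current * nominal_resistance  # E = I x R, 11 V
measured_drop = 5.5                                    # volts at the test point

# If the design current were still flowing, the resistance implied by
# the low reading would be half the marked value, suggesting a partial
# short or a failing component in that branch.
implied_resistance = measured_drop / expected_current  # R = E / I
print(expected_drop, implied_resistance)
```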

Should you be calibrating your instruments in-house or outsourced? Read our guide to find out.