Making Electrical Measurements Part 2: Loading
When making a measurement with a volt-meter, an oscilloscope, or any type of electronic measurement equipment, it is important to understand the concept of loading if you want to be sure your readings are accurate.
For example, suppose I use a volt-meter to measure the DC voltage at the output of a voltage-divider as shown in Figure 1, and I get a reading of 4 Volts. Assuming my meter is working properly, am I sure it's a good reading? Well, that depends on two things: the values of the resistors in the circuit, and the input impedance of the meter. In order to see what's going on, I need to look at the Thevenin's Equivalent Circuit for the voltage divider.
If you're not familiar with a Thevenin's Equivalent, it can be found in any book on circuit analysis. Basically, it's a single voltage (Vth) in series with a single resistor (Rth) as shown in Figure 2. The voltage (Thevenin's Voltage) is what you would measure with a perfect volt-meter. The resistance (Thevenin's Resistance) is found from R = E/I where E is the Thevenin's Voltage and I is the current you would get if you were to short-circuit the output to ground.
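The Thevenin's Equivalent of a two-resistor divider can be worked out in a few lines of code. Here is a minimal sketch (the function name and variable names are my own, not from any standard library): the open-circuit output gives Vth, and shorting the source shows Rth is the two resistors in parallel.

```python
# Sketch: Thevenin's Equivalent of a two-resistor voltage divider.
# V is the supply voltage, R1 the top resistor, R2 the bottom resistor.

def thevenin_of_divider(V, R1, R2):
    """Return (Vth, Rth) for a divider driven by source V."""
    Vth = V * R2 / (R1 + R2)       # open-circuit output voltage
    Rth = (R1 * R2) / (R1 + R2)    # source shorted: R1 in parallel with R2
    return Vth, Rth

# The divider in Figure 1: V = 12 Volts, both resistors 2k Ohms.
Vth, Rth = thevenin_of_divider(12.0, 2000.0, 2000.0)
print(Vth, Rth)  # 6.0 Volts, 1000.0 Ohms
```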
For the voltage divider I'm trying to measure, since both resistors are equal, Vth would be V/2 and Rth would be R/2. Figure 3 shows my meter as a resistor connected to the Thevenin's Equivalent of the voltage divider. Note that the input impedance of the meter looks like a resistor forming another divider. So the voltage across the leads of my meter is not Vth as you might expect, but is a value I can calculate as:
          Rin
Vm = ----------- x Vth
      Rin + Rth
Now suppose that V is 12 Volts and R is 2k Ohms. Then Vth will be 6 Volts and Rth will be 1k Ohm. Suppose that Rin of the meter is 10 Meg-Ohms. Using the above equation I get:
         10,000k
Vm = -------------- x 6 Volts = 5.9994 Volts
      10,000k + 1k
On a typical 3-digit meter, this will read 6.00 Volts. No problem, since that's the right reading.
But what if R in the divider is 2 Meg-Ohms? Then Rth is 1 Meg-Ohm and the equation gives:
      10,000k
Vm = ---------- x 6 Volts = 5.4545 Volts
      11,000k
On a typical 3-digit meter, this will read 5.45 Volts. Now I have a problem. The reading is wrong because the meter loaded down the circuit I was trying to measure. If R were 20 Meg-Ohms it would be even worse!
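The loading formula above is easy to check numerically. Here is a small sketch (the helper name is my own) running all three cases: R = 2k, R = 2 Meg, and the "even worse" R = 20 Meg, all with a 10 Meg-Ohm meter.

```python
# Sketch of the loading calculation: what a real meter reads across
# a Thevenin's Equivalent source (Vth, Rth).

def loaded_reading(Vth, Rth, Rin):
    """Voltage read by a meter with input impedance Rin."""
    return Vth * Rin / (Rin + Rth)

Rin = 10e6  # a 10 Meg-Ohm meter

print(loaded_reading(6.0, 1e3, Rin))   # 5.9994 V: meter shows 6.00, fine
print(loaded_reading(6.0, 1e6, Rin))   # 5.4545 V: meter shows 5.45, loaded
print(loaded_reading(6.0, 10e6, Rin))  # R = 20 Meg so Rth = 10 Meg: reads 3.0 V
```

Note how the error grows as Rth approaches the meter's input impedance: once Rth equals Rin, the meter reads only half the true voltage.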
What can I do to solve the problem? A few things. First, I can see if I can get a meter with a higher input impedance. Second, I can use a X10 probe if there is one for my meter (see Tech Tip on X10 probes). If all else fails, I can use a little math. If I know the input impedance of my meter and the Rth of the circuit I'm trying to measure, then I can correct my readings as follows:
                                    Rin + Rth
True Voltage = Measured Voltage x -----------
                                      Rin
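The correction above is just the loading divider run in reverse. A quick sketch (helper name is my own), applied to the bad reading from the earlier example:

```python
# Sketch: undo meter loading, given the meter's input impedance (Rin)
# and the Thevenin's Resistance (Rth) of the circuit under test.

def true_voltage(measured, Rin, Rth):
    """Correct a loaded meter reading back to the unloaded value."""
    return measured * (Rin + Rth) / Rin

# The 5.45 V reading from the 2 Meg-Ohm divider example:
print(true_voltage(5.4545, 10e6, 1e6))  # back to approximately 6.0 Volts
```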
If you are using a digital multimeter to measure voltage, then the input impedance is typically high (say, 10 Meg) and is the same on all input ranges. But if you are using an old-fashioned VOM, then the input impedance depends on the range the meter is set to. For instance, if the VOM is rated at 10k Ohms per Volt and is on the 0 - 50 Volt range, then Rin is 10k x 50 or 500k Ohms. But on the 0 - 5 Volt range Rin will only be 10k x 5 or 50k Ohms. Typical ratings for VOMs run from 1k Ohm per Volt at the low end to 20k Ohms per Volt at the high end.
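The ohms-per-volt arithmetic is simple enough to sketch in one line (the function name is my own, and the rating would come from your meter's spec sheet):

```python
# Sketch: a VOM's input impedance is its ohms-per-volt rating
# multiplied by the full-scale voltage of the selected range.

def vom_input_impedance(ohms_per_volt, full_scale_volts):
    return ohms_per_volt * full_scale_volts

print(vom_input_impedance(10_000, 50))  # 500,000 Ohms on the 0 - 50 V range
print(vom_input_impedance(10_000, 5))   # 50,000 Ohms on the 0 - 5 V range
```

Note the counterintuitive result: switching a VOM to a lower range makes its input impedance lower, so it loads the circuit harder.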
But what if I don't know the input impedance of my meter, or there is no way to calculate Rth? Can I still find out if I have a loading problem? Yes, by running a little test. Measure the voltage with your meter. Then put a 100k Ohm resistor in series with the red lead of the meter and measure the voltage again. If the two readings differ significantly, then you may have a loading problem.
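To see why the series-resistor test works, here is a sketch of what the meter would read with and without the extra 100k (helper name and numbers are my own, assuming the 500k VOM impedance and the 2 Meg-Ohm divider from above):

```python
# Sketch of the series-resistor loading test: the extra resistor in the
# meter lead simply adds to Rth, so if loading matters, the reading moves.

def reading(Vth, Rth, Rin, Rseries=0.0):
    """Meter reading with an optional series resistor in the lead."""
    return Vth * Rin / (Rin + Rth + Rseries)

v_direct = reading(6.0, 1e6, 500e3)          # VOM straight across the circuit
v_series = reading(6.0, 1e6, 500e3, 100e3)   # same, with 100k in series

print(v_direct)  # 2.0 Volts
print(v_series)  # 1.875 Volts -- a noticeable shift, so loading is significant
```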
So know the input impedances of all your measurement equipment. You'll find it on the specifications page in your user's manual. And have some idea of the internal resistances in the circuits you are measuring. Then you won't be fooled by loading. In later technical tips we will look at other factors that affect the accuracy of your measurements.