Measuring Milliohms with a Multimeter
The circuit consists of little more than a 6 V voltage regulator and a mains adapter capable of supplying around 300 mA at 9 to 12 V.
The circuit supplies a fixed output current of 100 mA or 10 mA, selected by switch S1, which connects either the 60 Ω or the 600 Ω resistor into the constant-current generator circuit. Each resistor value is produced by paralleling two identical resistors, either two 120 Ω or two 1.2 kΩ, both from the E12 standard resistor range. Two test leads with probes are used to deliver current to the test resistance. The resultant voltage drop is measured by the multimeter (M1). With the test current set to 100 mA a measurement of 1 mV indicates a resistance of 10 mΩ, while 0.1 mV is equal to 1 mΩ. At 10 mA (with S1 in the position shown in the diagram) a measurement of 1 mV indicates a resistance of 100 mΩ. Diode D1 protects the meter from an excessive input voltage.
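The range arithmetic above is just Ohm's law. A short sketch (the helper name is hypothetical, not part of the published design) makes the conversion explicit:

```python
def resistance_ohms(v_drop_volts: float, test_current_amps: float) -> float:
    """Ohm's law: R = V / I. Hypothetical helper for checking the ranges."""
    return v_drop_volts / test_current_amps

# 100 mA range: 1 mV of drop corresponds to 10 mOhm, 0.1 mV to 1 mOhm
print(resistance_ohms(1e-3, 0.100))   # 0.01 ohm  = 10 mOhm
print(resistance_ohms(1e-4, 0.100))   # 0.001 ohm = 1 mOhm

# 10 mA range: 1 mV of drop corresponds to 100 mOhm
print(resistance_ohms(1e-3, 0.010))   # 0.1 ohm   = 100 mOhm
```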
With the voltmeter connected as shown in the diagram, it measures not only the voltage drop across RX but also the drop produced by the resistance of the test leads and probes. To make a true measurement, first touch the probes close together on the same lead of the test resistance and note the reading; then place the probes across the test resistance and note the reading again. The first reading measures just the test leads and probes, while the second also includes the resistance RX. Subtract the first measurement from the second to obtain the value of RX.
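The two-reading subtraction can be written out as a minimal sketch; the function name and the example readings are illustrative, not from the article:

```python
def rx_ohms(v_leads_volts: float, v_total_volts: float,
            test_current_amps: float) -> float:
    """Subtract the lead-and-probe drop (probes touched together on one
    lead of the test resistance) from the reading taken across it."""
    return (v_total_volts - v_leads_volts) / test_current_amps

# Example: 0.3 mV with probes together, 1.3 mV across RX, at 100 mA
print(rx_ohms(0.3e-3, 1.3e-3, 0.100))   # 0.01 ohm = 10 mOhm
```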
The accuracy of the measurements is influenced by the contact resistance of switch S1, the precision of resistors R1 to R4, the level of the 6 V supply and, of course, the accuracy of the measuring voltmeter.
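To get a feel for how these error sources stack up, a worst-case estimate of the test current can be sketched; the tolerance figures used below are assumptions for illustration, not values given in the article:

```python
# Worst-case test-current error on the 100 mA range from assumed
# component tolerances (illustrative figures, not from the article).
V_NOM, R_NOM = 6.0, 60.0     # 6 V regulator output, 60 ohm range resistor
v_tol, r_tol = 0.04, 0.01    # assumed +/-4% regulator, +/-1% resistors

i_nom = V_NOM / R_NOM                                # 100 mA nominal
i_max = V_NOM * (1 + v_tol) / (R_NOM * (1 - r_tol))  # high supply, low R
i_min = V_NOM * (1 - v_tol) / (R_NOM * (1 + r_tol))  # low supply, high R
print(i_nom, i_min, i_max)   # roughly 0.100, 0.095, 0.105 A
```

Since the indicated resistance scales directly with the test current, a 5% current error translates into a 5% resistance error, which is why the supply level and resistor precision matter as much as the meter itself.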
For optimum decoupling, C1 should be fitted as close as possible to pin 1 of IC1. An additional electrolytic capacitor of around 500 µF can be used at the input to the circuit if the input voltage from the AC power adapter exhibits excessive ripple. (Author: Klaus Bertholdt, published in Elektor Magazine, Germany)