A multimeter, also known as a volt-ohm meter, is a handheld tester used to measure electrical voltage, current (amperage), resistance, and other values. Multimeters come in analog and digital versions and are useful for everything from simple tests, like measuring battery voltage, to detecting faults and performing complex diagnostics. They are one of the tools preferred by electricians for troubleshooting electrical problems on motors, appliances, circuits, power supplies, and wiring systems. DIYers can also learn to use multimeters for basic measurements around the house.
An analog multimeter is based on a microammeter (a device that measures amperage, or current) and has a needle that moves over a graduated scale. Analog multimeters are less expensive than their digital counterparts but can be difficult for some users to read accurately. Also, they must be handled carefully and can be damaged if they are dropped.
Analog multimeters typically are not as accurate as digital meters when used as voltmeters. However, analog multimeters are great for detecting slow voltage changes because you can watch the needle moving over the scale. Analog testers excel when set as ammeters, due to their low resistance and high sensitivity, with scales down to 50µA (50 microamperes).
Digital multimeters are the most commonly available type and include simple versions as well as advanced designs for electronics engineers. In place of the moving needle and scale found on analog meters, digital meters provide readings on an LCD screen. They tend to cost more than analog multimeters, but the price difference is minimal among basic versions. Advanced testers are much more expensive.
Digital multimeters typically are better than analog meters in the voltmeter function, due to their higher input resistance. But for most users, the primary advantage of digital testers is the easy-to-read and highly accurate digital readout.
Using a Multimeter
The basic functions and operations of a multimeter are similar for both digital and analog testers. The tester has two leads—red and black—and three ports. The black lead plugs into the "common" port. The red lead plugs into either of the other ports, depending on the desired function.
After plugging in the leads, you turn the knob in the center of the tester to select the function and appropriate range for the specific test. For example, when the knob is set to "20V DC," the tester will detect DC (direct current) voltage up to 20 volts. To measure smaller voltages, you would set the knob to the 2V or 200mV range.
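The range-selection logic described above can be sketched in a few lines of code: choose the smallest range that can still contain the value you expect to measure, so the reading uses as much of the scale's precision as possible without overloading. This is a hypothetical illustration, not firmware from any real meter; the range values mirror the 200mV/2V/20V ranges mentioned above, and the remaining steps are assumed typical values.

```python
# Illustrative sketch of manual range selection on a multimeter:
# pick the smallest range that the expected value still fits within.
# Range steps here are assumptions modeled on a common DC voltage dial.

DC_VOLTAGE_RANGES = [0.2, 2.0, 20.0, 200.0, 600.0]  # 200mV, 2V, 20V, 200V, 600V

def best_range(expected_volts, ranges=DC_VOLTAGE_RANGES):
    """Return the smallest range that will not be overloaded."""
    for r in sorted(ranges):
        if expected_volts <= r:
            return r
    raise ValueError("expected value exceeds the meter's maximum range")

print(best_range(12.0))   # a 12 V car battery fits in the 20 V range
print(best_range(0.05))   # a 50 mV signal fits in the 200 mV range
```

Picking too small a range overloads the display (digital meters typically show "OL"), while too large a range wastes resolution, which is why you step down to the 2V or 200mV range for small voltages.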
To take a reading, you touch the bare metal pointed end of each lead to one of the terminals or wires to be tested. The voltage (or other value) will read out on the tester. Multimeters are safe to use on energized circuits and equipment, provided the voltage or current does not exceed the maximum rating of the tester. Also, you must be careful never to touch the bare metal ends of the tester leads during an energized test because you can receive an electrical shock.
Multimeters are capable of many different readings, depending on the model. Basic testers measure voltage, amperage, and resistance and can be used to check continuity, a simple test to verify a complete circuit. More advanced multimeters may test for all of the following values:
- AC (alternating current) voltage and amperage
- DC (direct current) voltage and amperage
- Resistance (ohms)
- Capacitance (farads)
- Conductance (siemens)
- Duty cycle
- Frequency (Hz)
- Inductance (henrys)
- Temperature (Celsius or Fahrenheit)
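The continuity check mentioned above is essentially a threshold test on resistance: the meter treats the circuit as complete when the measured resistance is very low, and most meters beep to signal this. A minimal sketch, where the 30-ohm threshold is an assumption (real meters vary, but typically beep somewhere below roughly 30 to 50 ohms):

```python
# Hedged sketch of a continuity test: a circuit is "continuous" when its
# measured resistance falls below a small threshold. The threshold value
# is illustrative, not taken from any specific meter.

CONTINUITY_THRESHOLD_OHMS = 30.0

def is_continuous(resistance_ohms, threshold=CONTINUITY_THRESHOLD_OHMS):
    """Return True when the measured resistance indicates a complete circuit."""
    return resistance_ohms < threshold

print(is_continuous(0.4))  # an intact wire reads near zero ohms: True
print(is_continuous(1e6))  # a broken circuit reads very high: False
```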
Accessories or special sensors can be attached to some multimeters for additional readings, such as:
- Light level
- Wind speed
- Relative humidity