A multimeter is an electronic measuring instrument that can measure several different electrical quantities, such as voltage, current, and resistance. Multimeters are also known as multitesters, and the larger models, which are more sensitive and accurate, can sometimes be used to calibrate other instruments. Multimeters not only vary in the types of measurement they can perform; they also range in size from hand-held devices to larger bench-top models, with the smaller devices generally being the less sensitive ones. Depending on the application at hand, one of two kinds of multimeters is appropriate: analog or digital.
Digital multimeters are more modern and more accurate than analog multimeters, and they have several distinguishing characteristics. They display data, or output, as a digital numerical readout, which means the results can be highly specific and are easy to read. Of course, depending on the quantity being measured (voltage, current, or resistance), the manner in which a digital multimeter processes signals will differ. The correct function is selected with a dial on the front of the multimeter; in hand-held units, this is easily adjusted to choose a particular type of measurement. The dial also lets the user select the range of measurement by deciding where the decimal point falls. In turn, this decision determines how precise the resulting reading will be. The reading's precision is also known as resolution.
The range of measurement varies with the quantity being measured; for example, the range for monitoring voltage is different from the range for monitoring current or resistance. Generally speaking, the lower the range, the more precise the reading. However, if the range is set too low and the actual voltage is higher, the range will need to be adjusted upward. To change the range, simply move the decimal point forward or back one place. (A range with a maximum reading of 15 V would become 150 V.)
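To make the link between range and resolution concrete, here is a minimal Python sketch. It assumes a hypothetical 3 1/2-digit meter (a maximum display count of 1999); actual counts and ranges vary from meter to meter.

# Sketch: how the selected range sets the resolution of a reading,
# assuming a hypothetical 3 1/2-digit meter (maximum display count of 1999).

MAX_COUNT = 1999  # assumed display count; varies by meter

def resolution(full_scale_volts: float) -> float:
    """Smallest voltage step the display can distinguish on a given range."""
    return full_scale_volts / MAX_COUNT

for rng in (2, 20, 200):  # illustrative ranges, in volts
    print(f"{rng} V range -> resolution of roughly {resolution(rng) * 1000:.0f} mV")

Note how moving to a lower range gives a finer resolution, which is why the reading becomes more precise.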
Of course, using a multimeter to test voltage is different from using a multimeter to test current. Each type of measurement involves a different process for getting accurate results. When it comes to voltage, it's important to determine whether the test is for AC or DC voltage: this determines how you set the dial before testing begins. If testing AC voltage, set the dial to the "V" setting marked with a wavy line (~); if testing DC voltage, set it to the "V" setting marked with a straight or dashed line. Determine the appropriate range, setting it slightly higher than the expected result so the reading doesn't overload the multimeter. AC voltage testing often yields a fluctuating result, but the data will settle as the reading is taken. Be sure to consult the manual before connecting test leads to the circuit.
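One way to think about "slightly higher than the expected result" is to pick the lowest available range that still exceeds the expected value, as in the short Python sketch below. The list of ranges is made up for illustration and does not come from any particular meter.

# Sketch: choose the lowest range that still exceeds the expected voltage.
# The selectable ranges listed here are assumed for illustration only.

RANGES_V = [0.2, 2, 20, 200, 600]  # hypothetical ranges, in volts

def pick_range(expected_volts: float) -> float:
    for rng in RANGES_V:
        if rng > expected_volts:
            return rng
    raise ValueError("expected voltage exceeds the highest range on this meter")

print(pick_range(120))  # a circuit expected around 120 V calls for the 200 V range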
If you are using a multimeter to test current, it is helpful to use a clamp meter or a clamp meter adapter. With a clamp meter, simply close the head of the meter around a conductor; once the head is completely closed, take the measurement. With a current clamp adapter, the measurement is converted into a voltage, so set the function as if conducting an AC voltage test and set the range to millivolts. Then take the reading, applying the clamp in the same way as described above.
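Because a clamp adapter outputs a small voltage proportional to the current flowing through the conductor, the millivolt reading has to be converted back into amps. Here is a minimal Python sketch of that conversion; the 1 mV-per-amp output ratio is only an assumed example, so use the ratio printed on your own adapter.

# Sketch: convert a clamp-adapter millivolt reading back into amps.
# The output ratio is assumed for illustration; check the adapter's label.

MV_PER_AMP = 1.0  # assumed ratio: 1 mV of output per 1 A of measured current

def clamp_reading_to_amps(millivolts: float) -> float:
    return millivolts / MV_PER_AMP

print(clamp_reading_to_amps(12.5))  # a 12.5 mV reading corresponds to 12.5 A here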
When it comes to testing resistance, it's important to make sure that the power to the object being tested is turned off, both to avoid damaging the multimeter and to prevent inaccurate results. After setting the dial to the resistance function, connect the leads and take the reading.
Using a multimeter to test for continuity is a simple process. Set the dial to the continuity function and connect the leads. A beep indicates solid continuity; no beep indicates no continuity.
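Under the hood, the beep amounts to a simple threshold test on resistance. The Python sketch below assumes a cutoff of 30 ohms; the actual threshold varies from meter to meter.

# Sketch: the continuity beep is essentially a threshold test on resistance.
# The 30-ohm cutoff is an assumed value; real meters differ.

BEEP_THRESHOLD_OHMS = 30.0

def has_continuity(resistance_ohms: float) -> bool:
    return resistance_ohms < BEEP_THRESHOLD_OHMS

print(has_continuity(0.4))     # True  -> beep (a solid connection)
print(has_continuity(1500.0))  # False -> no beep (an open or broken path)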
Testing frequency is also straightforward, but it requires a known standard frequency against which to compare the result. Set the dial to hertz (Hz) and connect to the circuit. When the result registers, compare it to the standard frequency for the component being tested.
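That comparison can be sketched as a simple tolerance check, as in the Python lines below. The 60 Hz nominal value and the 1% tolerance are assumptions for illustration; substitute the standard frequency and tolerance for the component you are testing.

# Sketch: compare a measured frequency against an expected standard value.
# The 60 Hz nominal and 1% tolerance are assumed values for illustration.

def within_tolerance(measured_hz: float, nominal_hz: float = 60.0,
                     tolerance: float = 0.01) -> bool:
    return abs(measured_hz - nominal_hz) <= nominal_hz * tolerance

print(within_tolerance(59.7))  # True: within 1% of 60 Hz
print(within_tolerance(55.0))  # False: well outside the expected frequency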