Increasingly, metering devices are designed to provide a direct readout: the number on the meter is the indication. It’s that simple. Such a meter is called a digital meter. The main advantage of a digital meter is that anybody can read it easily, and there is no chance of interpolation error. This makes it ideal for utility meters, clocks, and some kinds of ammeters, voltmeters, and wattmeters. It works well when the measured quantity does not change often or fast.
In some situations, however, a digital meter is a disadvantage. A good example is the signal-strength indicator in a radio receiver. This meter bounces up and down as signals fade, as you tune the radio, or sometimes even as the signal modulates. A digital meter would show nothing but a constantly changing, meaningless set of numerals, because digital meters require a certain length of time to lock onto the current, voltage, power, or other quantity being measured. If that quantity never settles at any one value for long enough, the meter can never lock on.
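The lock-on behavior just described can be sketched in a few lines. This is a hypothetical model, not how any particular meter works: the display updates only after the input stays within a small tolerance for several consecutive samples, so a fluctuating input never produces a reading at all. The function name, tolerance, and sample values are all invented for illustration.

```python
def digital_readout(samples, tolerance=0.05, settle_count=3):
    """Return the first locked-in value, or None if the input never settles.

    Hypothetical sketch of a digital meter: it displays a number only after
    `settle_count` consecutive samples agree within `tolerance`.
    """
    stable = 1
    for prev, curr in zip(samples, samples[1:]):
        if abs(curr - prev) <= tolerance:
            stable += 1
            if stable >= settle_count:
                return round(curr, 2)  # meter locks on and displays this
        else:
            stable = 1  # input jumped; start counting over
    return None  # input never settled: no meaningful display

steady = [3.30, 3.31, 3.30, 3.30]        # steady voltage: meter locks on
fading = [3.1, 4.7, 2.2, 5.0, 1.8, 4.4]  # fading signal: meter never settles

print(digital_readout(steady))  # → 3.3
print(digital_readout(fading))  # → None
```

With the steady input the meter locks onto 3.3; with the fading one it returns nothing, which is exactly why an analog pointer, which simply follows the changes, serves better here.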
Meters with a scale and pointer are known as analog meters. Their main advantages are that they allow interpolation, they give the operator a sense of the quantity relative to other possible values, and they follow along when a quantity changes. Some engineers and technicians prefer analog metering, even in situations where digital meters would work just as well.
One potential hang-up with digital meters is being certain of where the decimal point goes. If you’re off by one decimal place, the reading will be wrong by a factor of 10. You also need to be sure you know what the units are. For example, a frequency indicator might be reading out in megahertz, and you might forget and think it is giving you a reading in kilohertz. That’s a mistake by a factor of 1000! Of course, this latter type of error can happen with analog meters, too.
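The size of that units mix-up is easy to verify with arithmetic. The sketch below uses an invented display value of 3.52 purely for illustration: interpreting a megahertz readout as kilohertz shrinks the frequency you write down by a factor of 1000.

```python
reading = 3.52  # hypothetical meter display (the meter means MHz)

actual_hz  = reading * 1_000_000  # correct interpretation: 3.52 MHz
assumed_hz = reading * 1_000      # mistaken interpretation: 3.52 kHz

error_factor = actual_hz / assumed_hz
print(error_factor)  # → 1000.0
```

The same arithmetic shows why a misplaced decimal point is a factor-of-10 error: each decimal place is one power of ten.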