Ampere Meter Indicator

Ampere Meter:
An Ampere Meter (ammeter) is a measuring instrument used to measure the current in a circuit. Electric currents are measured in amperes (A), hence the name. Instruments used to measure smaller currents, in the milliampere or microampere range, are designated milliammeters or microammeters. Early ammeters were laboratory instruments that relied on the Earth's magnetic field for operation. By the late 19th century, improved instruments were designed that could be mounted in any position and allowed accurate measurements in electric power systems. An ammeter is generally represented by the letter 'A' in a circuit diagram.
In much the same way that the analogue ammeter formed the basis for a wide variety of derived meters, including voltmeters, the basic mechanism of a digital meter is a digital voltmeter, and other types of meter are built around it.
Digital ammeter designs use a shunt resistor to produce a calibrated voltage proportional to the current flowing. This voltage is then measured by a digital voltmeter through an analog-to-digital converter (ADC), and the digital display is calibrated to show the current through the shunt. Such instruments are often calibrated to indicate the RMS value for a sine wave only, but many designs will indicate true RMS within the limitations of the waveform's crest factor.
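
As a rough illustration of the shunt-plus-ADC approach described above, the following Python sketch converts a raw ADC reading into a current and computes true RMS from a set of samples. The shunt resistance, ADC reference voltage, and resolution used here are hypothetical example values, not figures from this article.

```python
# Minimal sketch of the shunt + ADC conversion described above.
# All component values are hypothetical examples.

import math

SHUNT_OHMS = 0.01   # example shunt resistance: 10 milliohms
ADC_VREF = 3.3      # example ADC reference voltage in volts
ADC_BITS = 12       # example ADC resolution (counts 0..4095)

def adc_to_current(raw_count: int) -> float:
    """Convert a raw ADC count to current in amperes (Ohm's law: I = V / R)."""
    shunt_voltage = raw_count * ADC_VREF / (2 ** ADC_BITS - 1)
    return shunt_voltage / SHUNT_OHMS

def true_rms(currents: list[float]) -> float:
    """True RMS of a list of instantaneous current samples."""
    return math.sqrt(sum(i * i for i in currents) / len(currents))

# Example: a few raw ADC readings taken across one cycle
samples = [adc_to_current(c) for c in [120, 480, 910, 480, 120, 0]]
print(f"True RMS current: {true_rms(samples):.3f} A")
```

Computing true RMS from the squared samples, as above, is what lets a design stay accurate for non-sinusoidal currents, subject to the crest-factor limit mentioned earlier.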
The majority of ammeters are either connected in series with the circuit carrying the current to be measured (for small fractional amperes), or have their shunt resistors connected similarly in series. In either case, the current passes through the meter or (mostly) through its shunt. An ammeter must not be connected directly across a voltage source, since its internal resistance is very low and excess current would flow. Ammeters are designed for a low voltage drop across their terminals, much less than one volt; the extra circuit losses produced by the ammeter are called its "burden" on the measured circuit.
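As a worked example (with illustrative figures, not values from this article): a 10 mΩ shunt carrying 10 A drops only V = I × R = 10 A × 0.01 Ω = 0.1 V across its terminals, well under one volt, and the burden power dissipated in the shunt is P = I² × R = 1 W.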


