
Instrumentation & Error

This document provides an overview of instrumentation and measurement. It defines instrumentation as a branch of engineering related to studying instruments and their control. An instrument is defined as a device that measures a physical or electrical quantity. Measurement is quantified as comparing an unknown quantity to a standard. Accuracy, resolution, precision, and error are also defined as they relate to instrumentation. Examples of common instruments are provided. The document discusses analog and digital instruments, as well as transducers, signal modifiers, and indicating devices that make up electronic measurement instruments. It also covers units, types of errors, and statistical analysis of measurements.


Introduction

What is Instrumentation?
Instrumentation is the branch of engineering concerned with the study of
various instruments and their control.
Instruments
An instrument is a device that measures a physical or electrical
quantity such as flow, temperature, current, voltage, level, distance,
angle, or pressure.
What is measurement?
The measurement of a given parameter or quantity is the act of a
quantitative comparison between a predefined standard and an
unknown quantity to be measured.
Expected Value
The design value: the most probable value that calculations indicate one
should expect to measure.
Accuracy
The degree of exactness of a measurement compared to the expected
value, or the most probable value, of the variable being measured.
Resolution
The smallest change in a measured variable to which an instrument
will respond.
Precision
A measure of the consistency or repeatability of measurements.
Scientists, engineers and other humans use a vast range of instruments to
perform their measurements.
These instruments may range from simple objects such as rulers and
stopwatches to electron microscopes and particle accelerators. Virtual
instrumentation is widely used in the development of modern measuring
instruments.
Examples Around Us
• electricity meter
• gas meter
• odometer
• speedometer
• temperature gauge
• fuel meter
• clock
• weight scale
• measuring cup
Advantages of Electrical Instruments
• Different physical quantities can be converted into electrical
signals by transducers.
• Electrical signals can be amplified, multiplexed, filtered, and
measured easily.
• Electrical signals can be converted between analog and digital form
(A/D or D/A).
• Electrical signals can be transmitted over long distances by wire,
radio link, etc.
• Many measurements can be carried out simultaneously.
• Digital signals are compatible with computers.
• High sensitivity, low power consumption, high reliability.
Use of Block Diagrams
Any measuring instrument can be represented by a block diagram that
indicates the necessary elements and their functions. The entire
operation of a measuring system can be understood from such a block
diagram.
Analog Instruments
Instruments with a scale and a movable pointer, where the angle of
deflection of the pointer is a function of the electrical quantity being
measured, are known as analog instruments.
Digital Instruments
Instruments that measure an electrical signal in small digital steps
and display it in digits are known as digital instruments.
Examples of complex instrument panels: nuclear power plant instruments,
jumbo-jet aircraft instruments, and the instrument cluster of a modern
car such as the Nissan GT-R.

Elements of Electronic Measurement
Electronic measurement instruments are made up of three elements:
1. Transducer
2. Signal Modifier
3. Indicating Device
Transducer
A transducer converts a nonelectrical signal into an electrical signal;
it converts one form of energy into another.
Examples: temperature, pressure, and weight transducers.

Signal Modifier
A signal modifier processes the incoming electrical signal into useful
information by filtering, amplifying, and otherwise modifying the
signal.

Indicating Device
An indicating device is a deflection or indicating system that shows the
output on a specific scale.
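The three-element chain can be sketched as a short Python pipeline. This is only an illustration: the sensor sensitivity and amplifier gain below are hypothetical values, not data from the text.

```python
# Sketch of the measurement chain: transducer -> signal modifier -> indicator.
# Sensitivity (0.041 mV/degC) and gain (100) are hypothetical illustration values.

def transducer(temperature_c: float) -> float:
    """Converts a nonelectrical quantity (temperature) to an electrical one (mV)."""
    return 0.041 * temperature_c

def signal_modifier(millivolts: float) -> float:
    """Amplifies the weak sensor signal into a usable level."""
    gain = 100.0
    return gain * millivolts

def indicating_device(volts: float) -> str:
    """Scales the processed signal back to the measured quantity for display."""
    temperature_c = volts / (100.0 * 0.041)
    return f"{temperature_c:.1f} degC"

print(indicating_device(signal_modifier(transducer(25.0))))
```

Each stage only passes a number along, which mirrors the block-diagram view: every block has one input, one function, and one output.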
Physical Units and Their SI Units
• Electric Charge Q — coulomb (C)
• Electric Current I — ampere (A)
• Electromotive Force (Potential Difference) V — volt (V)
• Resistance R — ohm (Ω)
• Inductance L — henry (H)
• Capacitance C — farad (F)
Error
The degree to which a measurement conforms to the expected value is
expressed in terms of the error of the measurement:

e = Yn − Xn

where Yn is the expected value and Xn is the measured value.
Percent Error

% error = (Yn − Xn) / Yn × 100 %

Relative Accuracy

A = 1 − |Yn − Xn| / Yn

and the percent accuracy is a = 100 % − % error = A × 100 %.
Example: The expected value of the voltage across a resistor is 50 V;
however, measurement yields a value of 49 V. Calculate (a) the absolute
error, (b) the percent error, (c) the relative accuracy, and (d) the
percent accuracy.

(a) e = 50 − 49 = 1 V
(b) % error = 1/50 × 100 = 2 %
(c) A = 1 − 1/50 = 0.98
(d) a = 100 % − 2 % = 98 %
Precision
• The precision of a measurement is a quantitative, or numerical,
indication of the closeness with which a repeated set of
measurements of the same variable agrees with the average of the
set of measurements.
Example: The following set of ten measurements was recorded in the lab.
Calculate the precision of the 4th measurement.

x̄ = (98 + 102 + 101 + … + 99) / 10 = 101.1

Precision = 1 − |X4 − x̄| / x̄ = 1 − 0.04 = 0.96
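The precision formula can be checked numerically. The slide does not reproduce the full data set, so the ten readings below are hypothetical values chosen only to be consistent with the stated mean of 101.1:

```python
# Hypothetical ten readings (illustration only; the source elides most values).
readings = [98, 102, 101, 97, 100, 103, 98, 106, 107, 99]
mean = sum(readings) / len(readings)   # 101.1 for this illustrative set

def precision(xn, xbar):
    """Precision of one reading: 1 - |Xn - mean| / mean."""
    return 1 - abs(xn - xbar) / xbar

# Precision of the 4th measurement (index 3), rounded to two places.
print(round(precision(readings[3], mean), 2))
```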
Types of Errors
Gross Errors
Gross errors are generally the fault of the person using the instrument
and are due to such things as incorrect reading of instruments, incorrect
recording of experimental data, or incorrect use of instruments.
Systematic Error
Systematic errors are due to problems with instruments,
environmental effects, or observational errors. These errors recur if
several measurements are made of the same quantity under the same
conditions.
Following are the main systematic errors:
1. Instrument Errors, 2. Environmental Errors, 3. Observational Errors
i. Instrument Errors
Instrument errors may be due to friction in the bearings of the meter
movement, incorrect spring tension, improper calibration, or faulty
instruments. Instrument error can be reduced by proper maintenance, use,
and handling of instruments.
ii. Environmental Errors
Environmental conditions in which instruments are used may
cause errors. Subjecting instruments to harsh environments such
as high temperature, pressure, or humidity, or strong electrostatic
or electromagnetic fields may have detrimental effects, thereby
causing error.
iii. Observational Errors
Observational errors are those errors introduced by the observer.
The two most common observational errors are probably the parallax
error introduced in reading a meter scale and the error of estimation
when obtaining a reading from a meter scale.
Random Errors
Random errors are those that remain after the gross and systematic
errors have been substantially reduced, or at least accounted for.
Random errors are generally the accumulation of a large number
of small effects and may be of real concern only in measurements
requiring a high degree of accuracy. Such errors can only be
analyzed statistically.
Statistical Analysis of Measurements
Statistical analysis allows us to obtain the mean value and average
deviation of the data. This helps us to make quantitative judgments of
the variation and error in the data.
Average
An average is the arithmetic mean: the sum of a set of numbers divided
by the total number of pieces of data.

x̄ = (x1 + x2 + … + xn) / n

where n is the number of measurements.
Deviation
Deviation is the difference between each piece of test data and the
arithmetic mean. The deviations of x1, x2, …, xn from their mean x̄ are
denoted d1, d2, …, dn and defined as

dn = xn − x̄

The algebraic sum of the deviations of a set of numbers from their
arithmetic mean is zero.
Example: For the data x1 = 50.1, x2 = 49.7, x3 = 49.6, x4 = 50.2,
compute
(a) the arithmetic mean
(b) the deviation of each value
(c) the algebraic sum of the deviations

Mean:
x̄ = (50.1 + 49.7 + 49.6 + 50.2) / 4 = 49.9

Deviations:
d1 = 50.1 − 49.9 = 0.2
d2 = 49.7 − 49.9 = −0.2
d3 = 49.6 − 49.9 = −0.3
d4 = 50.2 − 49.9 = 0.3

The algebraic sum of the deviations:
dtot = 0.2 − 0.2 − 0.3 + 0.3 = 0
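The mean and deviations above can be reproduced with a short calculation (Python used only to verify the arithmetic):

```python
# Mean and deviations for the four-sample data set from the example.
data = [50.1, 49.7, 49.6, 50.2]
mean = sum(data) / len(data)             # arithmetic mean
deviations = [x - mean for x in data]    # d_n = x_n - mean

print(round(mean, 1))                     # arithmetic mean
print([round(d, 1) for d in deviations])  # individual deviations
print(abs(sum(deviations)) < 1e-9)        # algebraic sum is (numerically) zero
```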
Standard Deviation
The standard deviation S of a set of values is the degree to which the
values vary about the average value. The standard deviation of a set of
n numbers is

S = √( Σ (xn − x̄)² / (n − 1) )

Example: Compute the standard deviation for the data x1 = 50.1,
x2 = 49.7, x3 = 49.6, x4 = 50.2.

S = √( (0.2² + (−0.2)² + (−0.3)² + 0.3²) / (4 − 1) )
  = √( 0.26 / 3 )
  = 0.294
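The same sample standard deviation, computed directly from the formula:

```python
import math

# Sample standard deviation: S = sqrt( sum((x_n - mean)^2) / (n - 1) ).
data = [50.1, 49.7, 49.6, 50.2]
mean = sum(data) / len(data)
s = math.sqrt(sum((x - mean) ** 2 for x in data) / (len(data) - 1))

print(round(s, 3))
```

Note the n − 1 divisor (the sample form); Python's standard-library `statistics.stdev` uses the same convention.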
Limiting Error
Manufacturers guarantee accuracy as a percentage of the full-scale
reading, so a reading is guaranteed to be within those limits only at
full scale. It is therefore important to obtain measurements as close as
possible to full scale.

Example: A 300 V voltmeter is specified to be accurate within ±2 % at
full scale. Calculate the limiting error when the instrument is used to
measure a 120 V source.

The magnitude of the limiting error is
0.02 × 300 V = 6 V
Therefore, the limiting error at 120 V is
6 / 120 × 100 = 5 %
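The limiting-error calculation above, written out as a check (note how the percentage error grows as the reading moves away from full scale):

```python
# Limiting error of a 300 V full-scale meter (accurate to +/-2% of full scale)
# when reading a 120 V source.
full_scale = 300.0   # V
accuracy = 0.02      # fraction of full scale
reading = 120.0      # V

limiting_error_v = accuracy * full_scale                # fixed error in volts
limiting_error_pct = limiting_error_v / reading * 100   # relative to the reading

print(limiting_error_v, limiting_error_pct)
```

At full scale the same 6 V error would be only 2 % of the reading, which is why measurements near full scale are preferred.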
