EMM Unit 1
METROLOGY AND MEASUREMENTS
OBJECTIVES

UNIT I - BASICS OF METROLOGY
UNIT II - LINEAR AND ANGULAR MEASUREMENTS
UNIT III - ADVANCES IN METROLOGY
UNIT IV - FORM MEASUREMENT
UNIT V - MEASUREMENT OF POWER, FLOW AND TEMPERATURE
• OUTCOMES:
Upon completion of this course, students will be able to demonstrate
different measurement technologies and their use on industrial
components.
• TEXT BOOKS:
1. Jain R.K., “Engineering Metrology”, Khanna Publishers, 2005.
2. Gupta I.C., “Engineering Metrology”, Dhanpat Rai Publications, 2005.
• REFERENCES:
1. Charles Reginald Shotbolt, “Metrology for Engineers”, 5th edition, Cengage Learning EMEA, 1990.
2. Beckwith, Marangoni, Lienhard, “Mechanical Measurements”, Pearson Education, 2006.
Unit I - Basics of Metrology
• Introduction to Metrology
• Need of Measurement
• Elements of Metrology
• Standard, Workpiece, Instruments, Persons, Environment (SWIPE)
• Their effect on Precision and Accuracy
• Errors – Errors in Measurements – Types – Control
• Types of standards
INTRODUCTION TO METROLOGY
What is measurement?
Our daily activities:
• Buying things — grocery, gold, petrol, cloth
• Measuring emission level of vehicle
• Inspecting work pieces produced in a shift
• Drug delivery to a patient
• Measuring blood pressure, sugar level, body
temperature
The process of measurement yields a number that relates the
item (feature) under study to the reference unit of measurement.
INTRODUCTION TO METROLOGY
• WHAT IS A MEASUREMENT?
• A measurement is an act of assigning
a specific value to a physical variable
• The physical variable becomes the
measured variable
INTRODUCTION TO METROLOGY
• Metrology is the science of
measurement.
• In practical applications, it is the
enforcement, verification and validation
of predefined standards.
• Metrology is also concerned with the
industrial inspection and its various
techniques.
INTRODUCTION TO METROLOGY
• Metrology may be divided according to
the quantity to be measured, e.g.,
metrology of length, metrology of time.
• But for engineering purposes, it is
restricted to the measurement of lengths,
angles and other quantities that are
expressed in linear or angular terms.
• In the broader sense, it is not limited to
length measurement but is also
concerned with industrial inspection and
its various techniques.
INTRODUCTION TO METROLOGY
• Metrology is mainly concerned with:
(1) Establishing the units of measurements,
ensuring the uniformity of measurements.
(2) Developing methods of measurement.
(3) Errors of measurement.
(4) Accuracy of measuring instruments and
their care.
(5) Industrial inspection and its various
techniques.
INTRODUCTION TO METROLOGY
• Measurement encompasses different fields
such as communications, energy, medical
sciences, food sciences, environment,
trade, transportation, and military
applications.
• Metrology concerns itself with the study of
measurements.
• Measurement is an act of assigning an
accurate and precise value to a physical
variable.
INTRODUCTION TO METROLOGY
Metrology covers three main tasks:
• The definition of internationally accepted
units of measurement.
• The realization of units of measurement by
scientific methods.
• The establishment of traceability chains for
documenting the accuracy of a measurement.
TYPES OF METROLOGY
• Scientific or fundamental metrology
• Applied, technical or industrial metrology
• Legal metrology
SCIENTIFIC OR FUNDAMENTAL METROLOGY
• It deals with the establishment of
quantity systems, unit systems, units of
measurement, the development of new
measurement methods, realization of
measurement standards and the transfer
of traceability from these standards to
users in society.
• In India, the National Physical Laboratory
is the custodian of various primary
standards.
APPLIED, TECHNICAL OR INDUSTRIAL METROLOGY
• It deals with the application of measurement
science to manufacturing processes and their
use in society.
• It ensures the suitability of measurement
instruments, their calibration, and the quality
control of products.
• The emphasis in this area of metrology is on the
measurements themselves, and on traceability
of the calibration of the measurement devices,
to ensure confidence in the measurements.
LEGAL METROLOGY
Legal metrology is directed by a
national organization called the
National Service of Legal Metrology.
It also involves a number of international
organizations whose ultimate objective
is to maintain uniformity of
measurement throughout the world.
LEGAL METROLOGY
• It deals with the activities which result
from statutory requirements.
• It is concerned with legal requirements
of measurement processes, units of
measurement, measuring instruments
and methods of measurement.
• It establishes the necessary rules and
regulations on the quality and control of
measuring instruments and their use.
THE ACTIVITIES OF LEGAL METROLOGY
• The activities of legal metrology are:
(1) Control of measuring instruments.
(2) Testing of prototype/models of
measuring instruments.
(3) Examination of measuring
instruments to verify their conformity.
NEED OF MEASUREMENT/INSPECTION
• Measurement is the process by which any
material feature can be quantified.
• In the industrial sense, the terms measurement and
inspection are often used interchangeably.
• To ensure that the products supplied to the customer
are within the agreed specifications.
• To monitor the process performance. This ensures
that the number of rejects is as small as is economically
practicable.
• To ensure that the raw materials, purchased parts and
components conform to the purchaser's specifications.
NEED OF MEASUREMENT/INSPECTION
• To meet the interchangeability concept (i.e., the
diverse components produced in mass must fit
and mate even when chosen at random).
• To evaluate the possibility of reworking defective
parts.
• To exclude sources of error and deficiencies in the
processes.
• To establish limit gauging.
• To achieve reverse engineering.
• To augment the reputation of the manufacturer and
to help him become a world-class manufacturer.
QUALITY CONTROL –
METROLOGY AS A MEANS TO ACHIEVE
Quality control enables an inspector to sample the parts being
produced in a mathematical manner and to determine whether
or not the entire stream of production is acceptable.
The following steps must be taken while using quality control
techniques:
1) Sample the stream of product.
2) Measure the desired dimension.
3) Calculate the deviations of the dimensions from the mean
dimension.
4) Construct a control chart.
5) Plot succeeding data on the control chart, as sketched below.
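As a rough illustration of steps 3 to 5, this sketch computes X-bar chart limits from invented sample data. A production chart would normally use tabulated factors (e.g., A2 with the mean range); the plain 3-sigma shortcut here is only for illustration.

```python
import statistics

# Toy X-bar control-chart limits; the sample data are invented.
samples = [
    [25.02, 24.98, 25.01],   # each inner list: one sample from production
    [25.00, 25.03, 24.97],
    [24.99, 25.02, 25.00],
    [25.01, 24.96, 25.02],
]

xbars = [statistics.mean(s) for s in samples]   # sample means
grand_mean = statistics.mean(xbars)             # centre line
sigma = statistics.stdev(xbars)                 # spread of the sample means

ucl = grand_mean + 3 * sigma                    # upper control limit
lcl = grand_mean - 3 * sigma                    # lower control limit

# Plot (here: print) succeeding points and flag any outside the limits.
for i, x in enumerate(xbars, start=1):
    flag = "" if lcl <= x <= ucl else "  <-- out of control"
    print(f"sample {i}: x-bar = {x:.4f}{flag}")
print(f"CL = {grand_mean:.4f}, UCL = {ucl:.4f}, LCL = {lcl:.4f}")
```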
OBJECTIVES OF METROLOGY
• The basic objective of a measurement is to
provide the required accuracy at a minimum
cost.
1. Complete evaluation of newly developed
products.
2. Determination of Process Capabilities.
3. Determination of the measuring instrument
capabilities and ensure that they are quite
sufficient for their respective
measurements.
4. Minimizing the cost of inspection by
effective and efficient use of available
facilities.
OBJECTIVES OF METROLOGY
5. Reducing the cost of rejects and rework
through application of statistical quality
control techniques.
6. To standardize the measuring methods.
7. To maintain the accuracies of measurement.
8. To prepare design for all gauges and special
inspection fixtures.
PROCESS OF MEASUREMENT
• There are three main elements of measurement:
(1) Measurand: the physical quantity or property (length,
angle, diameter, thickness, etc.) to be measured.
(2) Reference: the physical quantity or property to which
quantitative comparisons are made, and which is
internationally accepted.
(3) Comparator: the means of comparing the measurand
(physical quantity) with the known standard (reference)
for evaluation.
• Suppose a fitter has to measure the
length of an M.S. flat. He first lays his rule
along the flat, then carefully aligns
the zero end of the rule with one end of the
flat, and finally compares the length
of the flat with the graduations on the
rule by eye. In this example, the
length of the M.S. flat is the measurand, the
steel rule is the reference, and the eye can
be considered the comparator.
General Measurement System
✔ Most measurement systems consist of
some or all of four general stages:
General Measurement System
• Sensor-Transducer Stage
The primary function of the first stage is to
detect or sense the physical variable
(measurand) and to perform either a mechanical or
an electrical transformation to convert the signal
into a more usable form.
General Measurement System
• Output Stage
Provides an indication of the value of the
measurement. The output equipment might be a
simple readout display or a marked scale, or it might
contain devices that can record the signal for later
analysis.
1. Direct method
• This is a simple method of measurement, in
which the value of the quantity to be measured is
obtained directly, without any calculations.
• For example, measurements by scales, vernier
calipers, micrometers, bevel protractors, etc.
• This method is most widely used in production.
It is not very accurate, because it
depends on human judgment.
2. Indirect Method
• In this method, the value of a quantity is
obtained by measuring other quantities
that are functionally related to the
required value. The related quantities are
measured directly, and the required value
is then determined by using a
mathematical relationship, as in the sketch below.
• Some examples of indirect measurement
are angle measurement using a sine bar,
measurement of strain induced in a bar
due to an applied force, and determination
of the effective diameter of a screw thread.
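As a small sketch of the indirect method, the angle set by a sine bar follows from the slip-gauge height h and the centre distance L between the sine-bar rollers; the numbers below are assumed for illustration.

```python
import math

# Indirect measurement: the angle is not read directly but computed
# from two directly measured lengths (values assumed for illustration).
L = 200.0    # centre distance between the sine-bar rollers, mm
h = 51.76    # height of the slip-gauge stack under one roller, mm

theta_deg = math.degrees(math.asin(h / L))
print(f"angle = {theta_deg:.2f} degrees")   # about 15.00 degrees
```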
3. Comparative Method
In this method, the value of the quantity to be
measured is compared with a known
value of the same quantity or another
related quantity; only deviations
from master gauges are recorded.
E.g., use of a dial indicator as a comparator.
4. Coincidence Method
• This is a differential method of
measurement wherein a very minute
difference between the quantity to be
measured and the reference is
determined by careful observation of
the coincidence of certain lines and
signals. Measurements on Vernier
caliper and micrometer are examples of
this method.
5. Fundamental or Absolute Method
• The measurement is based directly on the
definitions of the base quantities in terms of
which the quantity is defined.
6. Contact Method
• The sensor or measuring tip touches the surface
being measured.
• Ex: Vernier caliper, contact temperature sensor
7. Transposition Method
• The quantity to be measured is first balanced
by a known value; the two are then interchanged
(transposed) and balanced again by a second
known value, from which the true value follows.
• Ex: Determination of mass by the Gauss
double-weighing method.
8. Complementary Method
• The value of the quantity to be
measured is combined with a known
value of the same quantity.
• Ex: Determination of volume by
liquid displacement.
9. Deflection Method
• The value of the quantity to be measured is
indicated directly by the deflection of a
pointer over a calibrated scale.
• Ex: Pressure measurement with a dial gauge.
Selection of Measuring Instruments
Essential factors for proper selection of measuring
instrument:
• Range of the instrument.
• Resolution(discrimination) of the instrument.
• Accuracy expected — never demand an accuracy of
measurement higher than really needed, higher the degree
of accuracy higher the cost of measuring instrument.
• Installation requirements — mounting requirement,
vibration isolation, compressed air requirement, wireless
system, distance between the place of measurement and
control room, ambient conditions, etc.
• Final data requirement — immediate or later use.
Selection of Measuring Instruments
• Data format required — Indication or recording.
• Difficult to access environment—probes
• Cost factor- Advanced technologies
• Nature of measurand — static or dynamic.
• What is the parameter to be measured? — length,
diameter, surface finish...
• Measurement skill needed.
• Life expectancy/Stability
• Environmental effects — whether readings are
affected by changes in pressure, temperature,
etc.?
Environmental Control in Test Laboratory
• To perform accurate and reliable measurements in test
laboratories and to fulfill the requirements of
compatibility, it is absolutely essential that the
environment in the test laboratory be strictly controlled
and maintained.
• Environmental influences affect the accuracy of
measurement and thus introduce uncertainty.
• The various environmental factors to be controlled in test
laboratories, their effects on measurement, the
recommended values for such parameters, and their
monitoring should be strictly followed.
• Precision environments apply the right technology
based on specific laboratory design requirements.
Environmental Parameters
• Temperature
• Rate of Change of Temperature
• Relative humidity
• Barometric Pressure
• Air velocity and air distribution
• Dust particle Count
• Vibration and Shock
• Acoustic noise
• Illumination
• Magnetic fields
Metrology Lab Environment
ACCURACY
• Accuracy is defined as the closeness of the
measured value with true value.
OR
• Accuracy is defined as the degree to which
the measured value agrees with the true
value.
• Practically it is very difficult to measure the
true value and therefore a set of observations
is made whose mean value is taken as the
true value of the quantity measured.
PRECISION
• A measure of how close repeated trials are to each
other.
OR
• The closeness of repeated measurements.
• Precision is the repeatability of the measuring
process. It refers to the group of measurements for
the same characteristics taken under identical
conditions.
• It indicates to what extent identically performed
measurements agree with each other.
• If an instrument is not precise, it will give different
results for the same dimension when measured again
and again.
DISTINCTION BETWEEN PRECISION AND
ACCURACY
• Figure shows the difference between the
concepts of accuracy versus precision using a
dartboard analogy that shows four different
scenarios that contrast the two terms.
• A: Three darts hit the target center and are very
close together = high accuracy and precision
• B: Three darts hit the target center but are not
very close together = high accuracy, low precision
• C: Three darts do not hit the target center but are
very close together = low accuracy, high precision
• D: Three darts do not hit the target center and
are not close together = low accuracy and low
precision (a numeric version of this idea is sketched below).
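A numeric counterpart to the dartboard analogy, using invented readings: the bias of the mean captures accuracy, while the standard deviation of repeated readings captures precision.

```python
import statistics

# Invented repeated readings of a feature whose true value is known.
true_value = 10.00
readings = [10.02, 10.01, 9.99, 10.00, 10.03]

mean = statistics.mean(readings)
bias = mean - true_value                  # accuracy: closeness to the true value
spread = statistics.stdev(readings)       # precision: closeness of repeated readings

print(f"bias = {bias:+.3f} (accuracy), std dev = {spread:.3f} (precision)")
```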
Precision & Accuracy
• The figure also clearly depicts the difference
between precision and accuracy:
(a) Precise but not accurate (b) Accurate but not precise
(c) Precise and accurate (d) Neither precise nor accurate
FACTORS AFFECTING THE ACCURACY OF THE
MEASURING SYSTEM
• The factors affecting the accuracy of the
measuring system include:
1. Factors affecting the calibration standards.
2. Factors affecting the workpiece.
3. Factors affecting the inherent characteristics of
the instrument.
4. Factors affecting the person who carries out
the measurements.
5. Factors affecting the environment.
Factors affecting the calibration standards
• Coefficient of thermal expansion.
• Calibration interval.
• Stability with time.
• Elastic properties.
• Geometric compatibility.
Factors affecting the workpiece
• Cleanliness, surface finish, waviness,
scratches and surface defects, etc.
• Hidden geometry.
• Elastic properties.
• Adequate datum on the workpiece.
• Thermal equalization, etc.
Factors affecting the inherent characteristics of the
instrument
• Adequate amplification for the necessary objective.
• Scale error.
• Effects of friction, backlash, hysteresis and zero-drift
error.
• Deformation in handling or use, when heavy
workpieces are measured.
• Calibration errors.
• Mechanical parts (slides, guideways or moving
elements).
• Repeatability and readability.
• Contact geometry for both workpiece and standard.
Factors affecting the person
• Training and Skill.
• Sense of precision appreciation.
• Ability to select measuring instruments and
standard.
• Attitude towards personal accuracy
achievement.
• Planning measurement techniques for
minimum cost, consistent with precision
requirements, etc.
Factors affecting environment
• Temperature, humidity, etc.
• Clean surroundings, and minimum vibration.
• Adequate illuminations.
• Temperature equalization between standard,
workpiece, and instrument.
• Thermal expansion effects due to heat
radiation from lights, heating elements,
sunlight and people.
• Manual handling may also introduce thermal
expansion.
Effects of Environment on Precision
• Many applications specify tolerances in microns. It may be
appreciated that 25 mm of steel will lengthen by about 0.3 microns
when its temperature is increased by 1°C (see the sketch below).
• Obviously, for precision measurements in terms of microns, an error
of 0.3 microns caused just by a 1°C change of temperature is substantial.
• A piece of steel held in the hand absorbs heat very fast, and up to a
5°C rise in temperature can be expected in 5 minutes, but it
would take hours for it to cool to the laboratory temperature and
regain its original length.
• It would thus be realized that temperature has a great influence
on precision measurements. For such measurements, it is
essential that gauge blocks and workpieces are handled with
insulated forceps or tweezers, with plastic pads, or with gloves.
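The 0.3 micron figure quoted above can be checked with the linear expansion relation dL = alpha * L * dT; the expansion coefficient below is a typical value for steel, assumed for illustration.

```python
# Linear thermal expansion: dL = alpha * L * dT.
alpha = 11.5e-6    # per °C, a typical value for steel (assumed)
L_mm = 25.0        # length of the steel piece, mm
dT = 1.0           # temperature rise, °C

dL_microns = alpha * L_mm * dT * 1000   # convert mm to microns
print(f"elongation = {dL_microns:.2f} microns")   # about 0.29, i.e. ~0.3 microns
```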
Effects of Environment on Precision
• Usually a plastic shield is placed
between the inspector and the machine so as
not to influence the temperature of the machine's
surroundings through body radiant heat,
hot breath, or in any other way.
• In some interferometers, the machine is
entirely enclosed in a box of transparent
plastic, and the operator manipulates the parts
with long-handled, insulated forceps
introduced through self-sealing rubber
portholes.
Calibration of Measuring Instruments
• The process of validating measurements to ascertain
whether a given physical quantity conforms to the
original/national standard of measurement is known as
traceability of the standard.
• Calibration is the procedure used to establish a relationship
between the values of the quantities indicated by the
measuring instrument and the corresponding values realized
by standards under specified conditions.
• If the values of the variable involved remain constant (not time
dependent) while calibrating a given instrument, this type of
calibration is known as static calibration;
• whereas if the value is time dependent or time-based
information is required, it is called dynamic calibration.
A static calibration fit is sketched below.
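A minimal static-calibration sketch, assuming made-up readings: a least-squares line is fitted between the values realized by the standard and the values indicated by the instrument, and the fit is then inverted to correct future readings.

```python
# Static calibration: fit indicated readings against standard values.
standards = [0.0, 5.0, 10.0, 15.0, 20.0]   # values realized by the standard
indicated = [0.1, 5.2, 10.1, 15.3, 20.2]   # instrument readings (invented)

n = len(standards)
sx, sy = sum(standards), sum(indicated)
sxx = sum(x * x for x in standards)
sxy = sum(x * y for x, y in zip(standards, indicated))

# Ordinary least-squares slope and offset of the calibration line.
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
offset = (sy - slope * sx) / n
print(f"indicated ≈ {slope:.4f} * true + {offset:.4f}")

# Correct a future reading by inverting the calibration line.
reading = 12.4
corrected = (reading - offset) / slope
print(f"corrected value ≈ {corrected:.3f}")
```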
PERFORMANCE OF INSTRUMENTS
• All instrumentation systems are characterized
by the system characteristics or system
response.
• These comprise two basic kinds of characteristics:
static and dynamic.
• Characteristics of an instrument measuring a
condition that does not vary with time are called
static characteristics.
• Characteristics of an instrument measuring a
condition that varies with time are called
dynamic characteristics.
PERFORMANCE OF INSTRUMENTS
• Static response:
– The static characteristics of an instrument are considered
for instruments that are used to measure unvarying
process conditions.
• Dynamic response:
– The behavior of an instrument under time-varying
input-output conditions is called the dynamic response of
the instrument.
– The analysis of such dynamic response is called
dynamic analysis of the measurement system.
• Dynamic quantities are of two types:
– Steady-state periodic
– Transient
Errors in measurements
• It is never possible to measure the true
value of a dimension; there is always
some error.
• The error in measurement is the
difference between the measured value
and the true value of the measured
dimension.
• The error in measurement may be
expressed either as an absolute error or
as a relative error, as in the sketch below.
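A one-line worked example of the two forms, with assumed numbers: the absolute error carries the units of the measurement, while the relative error is dimensionless and usually quoted as a percentage.

```python
# Absolute and relative error (values assumed for illustration).
measured = 25.04   # mm
true = 25.00       # mm

absolute_error = measured - true            # in the units of the measurement
relative_error = absolute_error / true      # dimensionless
print(f"absolute error = {absolute_error:.3f} mm, "
      f"relative error = {relative_error:.4%}")
```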
Errors in measurement
•While performing physical measurements, it is
important to note that the measurements obtained
are not completely accurate, as they are associated
with uncertainty.
•Thus, in order to analyse the measurement data, we
need to understand the nature of errors associated
with the measurements.
•Therefore, it is imperative to investigate the causes
or sources of these errors in measurement systems
and find out ways for their subsequent elimination.
•Two broad categories of errors in measurement have
been identified: systematic and random errors.
Systematic or Controllable Errors
•A systematic error is a type of error that deviates by a
fixed amount from the true value of measurement.
•These types of errors are controllable in both their
magnitude and their direction, and can be assessed and
minimized if efforts are made to analyse them.
•When the systematic errors obtained are minimum, the
measurement is said to be extremely accurate.
•The following are the reasons for their occurrence:
1. Calibration errors
2. Ambient conditions
3. Deformation of workpiece
4. Avoidable errors
Calibration Errors
•A small amount of variation from the nominal value will
be present in the actual length standards, as in slip
gauges and engraved scales.
•Inertia of the instrument and its hysteresis effects do
not allow the instrument to translate with true fidelity.
•Hysteresis is defined as the difference between the
indications of the measuring instrument when the
value of the quantity is measured in both the ascending
and descending orders.
• Calibration curves are used to minimize such variations.
Inadequate amplification of the instrument also affects
the accuracy. A sketch of quantifying hysteresis follows.
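A small sketch of quantifying hysteresis from assumed data: the same scale points are read once in ascending and once in descending order, and the hysteresis is taken as the largest difference between the paired readings.

```python
# Hysteresis: difference between ascending and descending indications
# at the same scale points (readings invented for illustration).
ascending  = [0.00, 5.02, 10.05, 15.04, 20.01]
descending = [0.03, 5.08, 10.11, 15.08, 20.01]

hysteresis = max(abs(a - d) for a, d in zip(ascending, descending))
print(f"maximum hysteresis = {hysteresis:.2f} units")
```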
Avoidable Errors
These include the following:
•Datum error
•Reading errors
•Misreading of the instrument
•Errors due to parallax effect
•Effect of misalignment
•Zero Errors
Avoidable Errors
• Datum error is the difference between the true value of
the quantity being measured and the indicated value,
with due regard to the sign of each. When the instrument
is used under specified conditions and a physical quantity
is presented to it for the purpose of verifying the setting,
the indication error is referred to as the datum error.
• Reading errors These errors occur due to mistakes
committed by the observer while noting down the values
of the quantity being measured. Digital readout devices,
which are increasingly being used for display purposes,
eliminate or minimize most of the reading errors usually
made by the observer.
Avoidable Errors
•Errors due to parallax effect Parallax errors occur when the sight
is not perpendicular to the instrument scale or the observer
reads the instrument from an angle. Instruments having a scale
and a pointer are normally associated with this type of error. The
presence of a mirror behind the pointer or indicator virtually
eliminates the occurrence of this type of error.
•Effect of misalignment These occur due to the inherent
inaccuracies present in the measuring instruments. These errors
may also be due to improper use, handling, or selection of the
instrument. Wear on the micrometer anvils or anvil faces not
being perpendicular to the axis results in misalignment, leading
to inaccurate measurements. If the alignment is not proper,
sometimes sine and cosine errors also contribute to the
inaccuracies of the measurement.
Avoidable Errors
Zero errors When no measurement is
being carried out, the reading on the
scale of the instrument should be zero. A
zero error exists when the initial value of
a physical quantity indicated by the
measuring instrument is non-zero when
it should actually have been zero; the
correction is sketched below.
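Correcting for a zero error is a simple subtraction, as in this sketch with assumed numbers (e.g., a micrometer that indicates +0.02 mm with its anvils closed).

```python
# Zero-error correction: subtract the no-load indication from each reading.
zero_error = 0.02    # mm indicated with nothing being measured (assumed)
observed = 12.46     # mm, raw reading (assumed)

corrected = observed - zero_error
print(f"corrected reading = {corrected:.2f} mm")
```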
Random Errors
• Random errors provide a measure of random deviations
when measurements of a physical quantity are carried
out repeatedly.
• When a series of repeated measurements are made on a
component under similar conditions, the values or
results of measurements vary.
• Specific causes for these variations cannot be
determined, since these variations are unpredictable and
uncontrollable by the experimenter and are random in
nature. They are of variable magnitude and may be
either positive or negative.
• When these repeated measurements are plotted, they
follow a normal or Gaussian distribution, as the simulation
below illustrates.
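The simulation below is a sketch of that Gaussian behavior: repeated readings are generated with purely random error, and roughly 68% of them fall within one standard deviation of the mean, as a normal distribution predicts.

```python
import random
import statistics

random.seed(1)
true_value = 50.0
# 1000 simulated readings with random (Gaussian) error of std dev 0.01.
readings = [true_value + random.gauss(0.0, 0.01) for _ in range(1000)]

mu = statistics.mean(readings)
sigma = statistics.stdev(readings)
within_1s = sum(1 for r in readings if abs(r - mu) <= sigma) / len(readings)

print(f"mean = {mu:.4f}, std dev = {sigma:.4f}")
print(f"fraction within ±1 sigma = {within_1s:.0%}")   # close to 68%
```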
Random Errors
The following are the likely sources of random errors:
• Presence of transient fluctuations in friction in the
measuring instrument. Play in the linkages of the
measuring instruments.
• Error in operator’s judgment in reading the
fractional part of engraved scale divisions.
• Operator’s inability to note the readings because of
fluctuations during measurement.
• Positional errors associated with the measured
object and standard, arising due to small variations
in setting.
Difference between Systematic and Random Errors
Classification of Standards
• Accuracy is one of the most
important factors to be
maintained and should always
be traceable to a single source,
usually the national standards of
the country.
• National laboratories of most
developed countries are in
close contact with the BIPM.
• This is essential because
ultimately all these
measurements are compared
with the standards developed
and maintained by the bureaus
of standards throughout the
world.
Line and End Measurements
• When the distance between two engraved lines is used to
measure the length, it is called line standard or line
measurement.
• The most common examples are yard and metre. The rule with
divisions marked with lines is widely used.
• When the distance between two flat parallel surfaces is
considered a measure of length, it is known as end standard or
end measurement.
• The end faces of the end standards are hardened to reduce wear
and lapped flat and parallel to a very high degree of accuracy.
• The end standards are extensively used for precision
measurement in workshops and laboratories.
• The most common examples are measurements using slip
gauges, end bars, ends of micrometer anvils, vernier callipers,
etc.
Characteristics of Line Standards
The following are the characteristics of line standards:
• Measurements carried out using a scale are quick and
easy and can be used over a wide range.
• Even though scales can be engraved accurately, it is
not possible to take full advantage of this accuracy.
The engraved lines themselves possess thickness,
making it difficult to perform measurements with
high accuracy.
• The markings on the scale are not subjected to wear.
Undersizing occurs as the leading ends are subjected
to wear.
Characteristics of Line Standards
• A scale does not have a built-in datum,
which makes the alignment of the scale
with the axis of measurement difficult.
This leads to undersizing.
• Scales are subjected to parallax effect,
thereby contributing to both positive and
negative reading errors.
• A magnifying lens or microscope is
required for close tolerance length
measurement.
Characteristics of End Standards
End standards comprise a set of standard blocks
or bars using which the required length is created.
The characteristics of these standards are as
follows:
• These standards are highly accurate and ideal
for making close tolerance measurement.
• They measure only one dimension at a time,
thereby consuming more time.
• The measuring faces of end standards are
subjected to wear.
Characteristics of End Standards
• They possess a built-in datum because their
measuring faces are flat and parallel and can be
positively located on a datum surface.
• Groups of blocks/slip gauges are wrung together
to create the required size; faulty wringing leads
to inaccurate results (a stack-selection sketch follows).
• End standards are not subjected to parallax errors,
as their use depends on the feel of the operator.
• Dimensional tolerances as close as 0.0005 mm can
be obtained.
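The stack-building idea can be sketched as a search over available gauge sizes. The miniature gauge set below is hypothetical (a real set such as M87 has far more pieces), and the helper name build_stack is made up for illustration.

```python
from itertools import combinations

# Hypothetical miniature slip-gauge set, in mm (a real set has many more sizes).
GAUGES = [1.005, 1.01, 1.02, 1.12, 1.3, 1.4, 9.0, 10.0, 20.0, 30.0, 40.0]

def build_stack(target_mm, max_pieces=4):
    """Return the smallest combination of gauges that wrings to target_mm."""
    for n in range(1, max_pieces + 1):
        for combo in combinations(GAUGES, n):
            if abs(sum(combo) - target_mm) < 1e-6:
                return combo
    return None   # not buildable from this set within max_pieces

print(build_stack(41.125))   # e.g. (1.005, 1.12, 9.0, 30.0)
```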
Comparison between Line & End Standards

| S.No | Characteristic | Line Standard | End Standard |
|------|----------------|---------------|--------------|
| 1 | Principle | Length is expressed as the distance between two lines. | Length is expressed as the distance between two flat parallel faces. |
| 2 | Accuracy | Limited to ±0.2 mm; for high accuracy, scales have to be used in conjunction with a magnifying glass or microscope. | Highly accurate for measurement of close tolerances, up to ±0.001 mm. |
| 3 | Ease and time of measurement | Measurement is quick and easy. | The use of end standards requires skill and is time-consuming. |
| 4 | Effect of wear | Scale markings are not subject to wear; however, significant wear may occur on the leading ends, making it difficult to take the zero of the scale as a datum. | These are subjected to wear on their measuring surfaces. |