Chapter 2

Theory of Measurement

Mandip Rai
1

Static Performance Parameters
• Static performance parameters are those parameters of an
instrumentation system that do not vary with time.
• These parameters are critical for assessing the quality
and reliability of measurements.
• They include characteristics like accuracy, precision,
sensitivity, resolution, linearity, and others (e.g.,
repeatability, drift).

2
Accuracy
• Measures how close the instrument’s output is to the true or accepted value.
• Expressed as a percentage of full scale or reading (e.g., ±1% of full scale).
• High accuracy is essential in critical applications like medical devices or aerospace systems.
• Influenced by factors like calibration, environmental conditions, and sensor quality.
• Example:
Suppose a thermometer is used to measure boiling water, which is actually at 100°C.
If the thermometer reads 98°C, it is accurate to within 2°C. So, the error is 2%, and the accuracy is 98%.
Precision
• Refers to the instrument’s ability to consistently reproduce the same output under identical conditions.
• High precision indicates low random error.
• Important in applications where repeatability of measurements is critical (e.g., lab testing).
• Can be precise but not accurate if there is a systematic error.
• Example:
You measure the length of an object three times with a ruler and get 10.2 cm, 10.2 cm, and 10.2 cm.
Even if the actual length is 10.5 cm, the readings are precise but not accurate.
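A minimal Python sketch of the two ideas, using the ruler readings above: the offset from the true value reflects (in)accuracy, while the spread of repeated readings reflects precision.

```python
# Readings from the ruler example above: true length 10.5 cm, three identical readings
true_value = 10.5                       # cm
readings = [10.2, 10.2, 10.2]           # cm, repeated measurements

mean_reading = sum(readings) / len(readings)
error = mean_reading - true_value       # offset from the true value -> (in)accuracy
spread = max(readings) - min(readings)  # scatter of repeated readings -> precision

print(f"Mean reading: {mean_reading:.1f} cm")
print(f"Error: {error:+.1f} cm (about {abs(error) / true_value * 100:.1f} % of the true value)")
print(f"Spread: {spread:.1f} cm -> highly precise, but not accurate")
```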

3
Sensitivity
• Defined as the ratio of output change to input change.
• Indicates how responsive the instrument is to small changes in input.
• High sensitivity is ideal for detecting minute changes (e.g., pressure, voltage).
• However, too high sensitivity may also amplify noise.
• Example:
A digital thermometer shows a 1°C increase for every 0.1 V change in the sensor signal. So, its sensitivity is 1°C / 0.1 V = 10°C/V.
Higher sensitivity helps detect even small temperature changes.
Resolution
• The smallest change in input that can be detected or displayed by the instrument.
• Limited by hardware (e.g., ADC resolution) and signal processing.
• Important for fine measurements, like in digital micrometers or precision balances.
• A higher resolution doesn’t always mean higher accuracy.
• Example:
A weighing scale that displays weight in increments of 0.01g has a resolution of 0.01g.
If an object gains 0.005g, the change may not be reflected due to limited resolution.
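A short Python sketch of both parameters, using the numbers from the examples above; the truncation model for the display is an assumption made only for illustration.

```python
# Sensitivity and resolution, using the numbers from the examples above
import math

# Sensitivity: 1 °C of displayed change per 0.1 V of sensor-signal change
delta_output = 1.0        # °C
delta_input = 0.1         # V
sensitivity = delta_output / delta_input          # 10 °C/V

# Resolution: a scale that displays in 0.01 g steps cannot show a 0.005 g change
resolution = 0.01         # g, smallest displayed step
true_change = 0.005       # g
displayed_change = math.floor(true_change / resolution) * resolution  # simple truncation model

print(f"Sensitivity: {sensitivity:.1f} °C/V")
print(f"Displayed change: {displayed_change} g (the 0.005 g change is lost below the 0.01 g step)")
```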

4
Linearity
• Indicates how well the output follows a straight-line relationship with the input.
• Perfect linearity means output is directly proportional to input across the range.
• Non-linearity leads to distortion in measurement and is usually corrected
through calibration or compensation circuits.
• Measured as a deviation from the ideal straight line (often as a percentage of
full scale).
• Example:
In a temperature sensor, if the input increases linearly (say 10°C, 20°C, 30°C),
and the output voltage increases proportionally (1V, 2V, 3V), the system is
linear.
If at 30°C, the voltage jumps to 3.5V instead of 3V, it shows non-linearity.
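A brief Python sketch of how non-linearity can be expressed as a percentage of full scale, using the sensor readings above; the ideal gain of 0.1 V/°C and the 3 V full-scale value are assumptions taken from the example.

```python
# Non-linearity from the temperature-sensor example above (ideal gain assumed 0.1 V/°C)
inputs  = [10.0, 20.0, 30.0]   # °C
outputs = [1.0, 2.0, 3.5]      # V, with 3.5 V instead of the ideal 3.0 V at 30 °C

ideal_gain = 0.1               # V/°C, slope of the ideal straight line
full_scale = 3.0               # V, ideal output at the top of the range

deviations = [out - ideal_gain * inp for inp, out in zip(inputs, outputs)]
max_dev = max(abs(d) for d in deviations)
nonlinearity = max_dev / full_scale * 100

print(f"Max deviation from the ideal line: {max_dev:.2f} V")
print(f"Non-linearity: {nonlinearity:.1f} % of full scale")   # 0.5 V / 3.0 V ≈ 16.7 %
```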

5
Dynamic Performance Parameters

• These parameters define how an instrument or system responds to time-varying inputs.
• Important in real-time and control systems where signals change rapidly.
• Help evaluate speed, stability, and accuracy of response to dynamic
conditions.
• Common in applications like temperature control, robotics, sound
systems, and vibration analysis.
• Dynamic performance parameters are essential when input signals vary
over time.

6
Response Time
• Definition: Time taken by the output to reach a specified percentage (commonly 95% or 99%) of its final value after an
input step change.
• Includes:
• Delay time – Time before the output starts changing.
• Rise time – Time to go from 10% to 90% of final value.
• Settling time – Time to stay within a small range (e.g., ±5%) of the final value.
• Indicates: Speed of the system in responding to a sudden input change.
• Example: A digital thermometer placed in boiling water shows near-final reading in 2.5 seconds – that’s the response
time.
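A minimal Python sketch of response time, assuming the thermometer behaves like a first-order system; the time constant tau is an assumed value chosen so the 95 % point lands near the 2.5 s of the example.

```python
# First-order step-response model (time constant tau is an assumed value)
import math

tau = 0.83            # s, assumed time constant of the thermometer
final_value = 100.0   # °C, final reading in boiling water

def reading(t):
    # Classic first-order response to a step input
    return final_value * (1 - math.exp(-t / tau))

# Step forward in small increments until the output reaches 95 % of its final value
t, dt = 0.0, 0.001
while reading(t) < 0.95 * final_value:
    t += dt
print(f"Response time (to 95 % of final value): {t:.2f} s")   # ≈ tau * ln(20) ≈ 2.5 s
```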
Frequency Response
• Definition: Describes the output behavior of a system when the input signal frequency varies.
• Importance:
• Evaluates ability to track periodic or oscillating signals.
• Shows whether output amplitude and phase remain accurate at higher frequencies.
• Includes: Gain (amplitude response) and phase shift at different frequencies.
• Example: A vibration sensor correctly follows input up to 1000 Hz, but shows signal distortion beyond that —
indicating the limit of usable frequency response.

7
Bandwidth
• Definition: The range of input frequencies over which the
instrument maintains an accurate, usable output (typically within
-3 dB of the max response).
• Unit: Hertz (Hz)
• Related to: Speed of response — wider bandwidth means the
system can follow faster signals.
• Example: An oscilloscope with a bandwidth of 10 MHz can
accurately capture signals that change at up to 10 million cycles
per second.
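A small Python sketch of how bandwidth relates to frequency response, assuming a simple first-order low-pass model for the instrument front end; the 10 MHz cutoff follows the oscilloscope example, and the first-order model itself is an assumption.

```python
# Gain vs. frequency for an assumed first-order system with a 10 MHz cutoff
import math

fc = 10e6   # Hz, assumed cutoff (bandwidth) of the oscilloscope front end

def gain_db(f):
    # Magnitude of a first-order low-pass response: 1 / sqrt(1 + (f/fc)^2)
    return 20 * math.log10(1 / math.sqrt(1 + (f / fc) ** 2))

for f in (1e6, 10e6, 100e6):
    print(f"f = {f / 1e6:6.1f} MHz -> gain = {gain_db(f):6.2f} dB")
# Gain is ~0 dB well below fc, about -3 dB at fc, and falls off rapidly above it.
```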

8
Error in Measurement
• Errors are deviations between the measured value and the true value.
• Understanding error types helps improve measurement accuracy and reliability.
Gross Error
• Definition: Errors due to human mistakes during observation, recording, or handling
of instruments.
• Causes:
• Misreading an instrument scale.
• Improper use of equipment.
• Incorrect data entry.
• Example: Reading 6.5 on a scale when the actual value is 5.5.
• Minimization: Careful observation, proper training, and use of automated systems.

9
Systematic Error
• Definition: Errors that are consistent and repeatable, caused by flaws in the measurement
system.
• Characteristics:
• Predictable and often constant over time.
• Can be calibrated or corrected.
There are two types of Systematic Error: Instrumental Error and Environmental Error.
1. Instrumental Error
• Definition: Caused by imperfections or limitations in the measuring instrument.
• Causes:
• Calibration errors.
• Zero error or scale misalignment.
• Friction in mechanical parts.
• Example: A voltmeter consistently reads 0.2V higher than actual voltage.
• Solution: Calibration, maintenance, and using higher-quality instruments.

10
2. Environmental Error
• Definition: Caused by environmental factors that affect measurement.
• Causes:
• Temperature changes.
• Humidity, pressure, magnetic or electric fields.
• Example: A balance gives inaccurate readings due to wind or vibrations in the room.
• Solution: Control environment or use compensation techniques.
Random Error
• Definition: Unpredictable, irregular fluctuations that occur in repeated measurements.
• Characteristics:
• Caused by unknown or uncontrollable variables.
• Affects precision more than accuracy.
• Example: Slight variations in timing when using a stopwatch manually.
• Solution: Perform multiple measurements and use statistical averaging.

11
Error Type          | Cause                                | Effect                    | Can be Reduced?
Gross Error         | Human mistakes                       | Large, obvious deviation  | Yes, with care
Instrumental Error  | Faulty or miscalibrated instruments  | Constant bias             | Yes, by calibration
Environmental Error | External conditions                  | Consistent shift          | Yes, by control
Random Error        | Unknown random influences            | Inconsistent readings     | Yes, by averaging

12
Statistical Analysis of Error in Measurement
1. Mean (Average) Value
• Represents the central tendency of repeated measurements.
• Provides the best estimate of the true value when random errors dominate.
• Sensitive to Outliers: Extreme values can significantly affect the mean.
• Example: In the dataset 1, 2, 3, 4, 100, the mean is 22, which is skewed by the outlier (100).
• Uses All Data Points: Every value in the dataset contributes to the mean.
• Algebraic Manipulation: The mean can be used in further statistical calculations (e.g., standard
deviation, variance).
• Unique Value: A dataset has only one arithmetic mean.
• When to Use the Mean?
• When the data is normally distributed (symmetrical).
• When there are no extreme outliers.
• When working with interval or ratio data (numeric scales).
13
2. Deviation
• Deviation refers to the difference between an observed value and a reference point (such as the mean,
expected value, or a standard).
• Useful for evaluating individual measurement error.
3. Average Deviation
• Mean of the absolute values of deviations.
• Formula:

  $\text{Average Deviation} = \frac{1}{n}\sum_{i=1}^{n} |x_i - \bar{x}|$
4. Standard Deviation (σ)


• Key indicator of spread or scatter of measurements.
• Lower σ = higher precision.
• Widely used in quality control and uncertainty analysis.
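A short Python sketch computing the mean, deviations, average deviation, and standard deviation; the repeated readings are assumed values used only for illustration.

```python
# Mean, deviations, average deviation and standard deviation for assumed readings
import statistics

readings = [10.1, 10.3, 10.2, 10.4, 10.2]   # assumed repeated measurements

mean = statistics.mean(readings)
deviations = [x - mean for x in readings]
avg_dev = sum(abs(d) for d in deviations) / len(readings)
std_dev = statistics.stdev(readings)         # sample standard deviation

print(f"Mean: {mean:.3f}")
print(f"Average deviation: {avg_dev:.3f}")
print(f"Standard deviation: {std_dev:.3f}")
```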

14
5. Confidence Level and Confidence Interval
• Indicates how sure you are that the measured average includes the true value.
• Common levels: 95%, 99%.
• Wider intervals at higher confidence levels.
• Formula for 95% CI:

  $CI = \bar{x} \pm z \cdot \frac{\sigma}{\sqrt{n}}$

  where z = 1.96 for 95% confidence.

6. Variance (σ²)
• Square of standard deviation.
• Used in statistical modeling and error propagation analysis.
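A minimal Python sketch of the 95 % confidence interval and the variance, reusing the same assumed readings as above.

```python
# 95 % confidence interval and variance for the same assumed readings
import math
import statistics

readings = [10.1, 10.3, 10.2, 10.4, 10.2]   # assumed repeated measurements
z = 1.96                                     # z-value for 95 % confidence

mean = statistics.mean(readings)
sigma = statistics.stdev(readings)
variance = sigma ** 2
half_width = z * sigma / math.sqrt(len(readings))

print(f"Variance: {variance:.4f}")
print(f"95 % CI: {mean:.3f} ± {half_width:.3f}")
```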
15
7. Percentage Error
• Useful for comparing error relative to the magnitude of the quantity measured.
• Formula:

  $\text{Percentage Error} = \frac{\text{Measured Value} - \text{True Value}}{\text{True Value}} \times 100\,\%$

8. Root Mean Square Error (RMSE)


• A measure of the magnitude of error.
• Used especially when comparing experimental vs. theoretical values.
• Formula:

  $\text{RMSE} = \sqrt{\frac{1}{n}\sum_{i=1}^{n} (y_i - \hat{y}_i)^2}$
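A short Python sketch of both quantities: the percentage error uses the thermometer example from earlier (98 °C measured against a true 100 °C), while the experimental and theoretical values for the RMSE are assumed for illustration.

```python
# Percentage error (thermometer example: 98 °C measured, 100 °C true) and RMSE
import math

true_value = 100.0
measured_value = 98.0
percentage_error = (measured_value - true_value) / true_value * 100
print(f"Percentage error: {percentage_error:+.1f} %")

# RMSE between assumed experimental values y and theoretical predictions y_hat
y     = [1.0, 2.1, 2.9, 4.2]
y_hat = [1.0, 2.0, 3.0, 4.0]
rmse = math.sqrt(sum((a - b) ** 2 for a, b in zip(y, y_hat)) / len(y))
print(f"RMSE: {rmse:.3f}")
```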

16
Measurement of Resistance (Low, Medium and High)

Resistance Range                 | Measurement Method            | Key Features                               | Real-World Example
Low Resistance (<1 Ω)            | Kelvin Double Bridge (4-Wire) | Eliminates lead resistance, high precision | Measuring shunt resistors in power supplies or motor windings in EVs.
Low Resistance (<1 Ω)            | Potentiometer Method          | Compares with a standard resistor          | Calibrating precision current sensors in industrial equipment.
Medium Resistance (1 Ω – 100 kΩ) | Digital Multimeter (DMM)      | Fast, portable, moderate accuracy          | Testing resistors on a PCB or checking thermistor values in HVAC systems.
Medium Resistance (1 Ω – 100 kΩ) | Wheatstone Bridge             | High accuracy, manual balancing            | Measuring strain gauges in structural engineering.
High Resistance (>100 kΩ)        | Megohmmeter (Megger)          | High-voltage insulation testing            | Checking cable insulation in power lines or motor winding integrity.
High Resistance (>100 kΩ)        | Electrometer Method           | Measures ultra-high resistance (TΩ range)  | Testing semiconductor leakage in microchips or insulation in spacecraft components.
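The Wheatstone bridge row above rests on the bridge balance condition; a minimal Python sketch is shown below, with all arm values assumed and one common labeling of the arms.

```python
# Unknown resistance from a balanced Wheatstone bridge (arm values assumed)
# With ratio arms R1, R2 and standard arm R3, balance gives R1 * Rx = R2 * R3
R1 = 1000.0   # ohm, ratio arm
R2 = 1000.0   # ohm, ratio arm
R3 = 472.0    # ohm, adjustable standard arm reading at balance (assumed)

Rx = R2 * R3 / R1
print(f"Unknown resistance Rx = {Rx:.1f} ohm")
```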
17
Thank you

21
