CEC348 Remote Sensing Lecture Notes 1
COURSE MATERIAL
REMOTE SENSING
UNIT I & II
for
III YEAR / VI SEMESTER
ACADEMIC YEAR 2020-21
www.EnggTree.com
BY
Dr.V.SENTHILKUMAR
ASSOCIATE PROFESSOR
DEPARTMENT OF ELECTRONICS AND COMMUNICATION
surface without actually being in contact with it. This is done by sensing and recording
reflected or emitted energy and processing, analyzing, and applying that information." In
much of remote sensing, the process involves an interaction between incident radiation
and the targets of interest. This is exemplified by the use of imaging systems, where
seven elements are involved. Note, however, that remote sensing also involves the
sensing of emitted energy and the use of non-imaging sensors.
The types or characteristics of platform depend on the type of sensor to be attached and its
application.
Type of Platforms:
Platforms can vary from stepladders to satellites.
There are different types of platforms, classified based on their altitude above the earth's surface.
Three types of platforms are used to mount the remote sensors
1. Ground based Platform
2. Air - borne Platform, and
3. Space-borne Platform
Ground based Platforms:
Ground-based platforms are used to record detailed information about the objects or
features of the earth's surface.
They are developed for the scientific understanding of signal-object and signal-sensor
interactions.
They include both laboratory and field studies, and are used both in designing sensors
and in the identification and characterization of land features.
Example: Handheld platform, cherry picker, towers, portable masts and vehicles etc.
Portable handheld photographic cameras and spectroradiometers are widely used in
laboratory and field experiments as reference data and for ground truth verification.
Cranes and cherry pickers are common ground-based platforms; a cherry-picker platform can extend up to approximately 15 m.
Air- borne/ based Platforms:
Airborne platforms were the sole non-ground-based platforms for early remote
sensing work.
Aircraft remote sensing systems may also be referred to as sub-orbital, airborne, or
aerial remote sensing systems.
At present, airplanes are the most common airborne platforms. Other
observation platforms include balloons, drones (sometimes called "sky spies") and high-altitude
sounding rockets. Helicopters are occasionally used.
Balloons:
Balloons are used for remote sensing observation (aerial photography) and nature
conservation studies.
The first aerial images were acquired with a camera carried aloft by a balloon in 1859.
Balloons float at a roughly constant height of about 30 km.
As platforms, balloons are far less expensive than aircraft. They come in a great variety
of shapes, sizes and performance capabilities.
The balloons have low acceleration, require no power and exhibit low vibrations.
The balloon system consists of a rigid circular base plate supporting the entire sensor
system, which is protected by an insulating and shock-proof light casing.
The payload used in the Indian balloon experiment consisted of three Hasselblad cameras
with different film-filter combinations, providing panchromatic, infrared black-and-white
and infrared false-colour images.
Because the flight altitude is high compared with the aircraft heights normally used for
aerial survey, balloon imagery gives larger synoptic views.
The balloon is governed by the wind at the floating altitude
There are three main types of balloon systems: free balloons, tethered balloons
and powered balloons.
Free balloons can reach almost top of the atmosphere; hence, they can provide a
platform at intermediate altitude between those of aircraft and spacecraft (shown in
fig.)
Free balloons have an altitude range of 22-40 km and can be used to a limited extent as platforms.
Drone:
Drone is a miniature remotely piloted aircraft.
It is designed to fulfill the requirements for a low-cost platform with long endurance,
moderate payload capacity and the capability to operate without a runway or from a small
runway.
Drones carry equipment for photography, infrared detection, radar observation and
TV surveillance, and use a satellite communication link.
An onboard computer controls the payload and stores data from different sensors and
instruments.
Aircraft Platform:
Aircraft are used to collect very detailed images.
Helicopters can be used for pinpoint locations, but they vibrate and lack stability.
Special aircraft with cameras and sensors mounted on vibration-free platforms are
traditionally used to acquire aerial photographs and images of land surface features.
While low-altitude aerial photography results in large-scale images providing detailed
information on the terrain, high-altitude, smaller-scale images offer the advantage of
covering a larger study area, though with lower spatial resolution.
Aircraft platforms offer an economical method of remote sensing data collection for
small to large study areas with cameras, electronic imagers, across- track and along-
track scanners, and radar and microwave scanners.
Low-altitude aircraft: The most widely used; they generally operate below 30,000 ft
and are suitable for obtaining large-scale image data of small areas.
High-altitude aircraft: These include jet aircraft with a good rate of climb, high maximum
speed and a high operating ceiling; they acquire imagery of large areas.
Rockets as Platforms:
High-altitude sounding-rocket platforms are useful for assessing the reliability of
remote sensing techniques as far as their dependence on distance from the target
is concerned.
Balloons have a maximum altitude of approximately 37 km, while satellites cannot
orbit below 120 km. High-altitude sounding rockets fill this gap, operating at moderate
altitudes above the terrain.
Synoptic imagery can be obtained from rockets for areas of some 500,000 square km.
Space-borne/ based Platforms:
In space- borne remote sensing, sensors are mounted on-board a spacecraft (space
shuttle or satellite) orbiting the earth.
Space-borne or satellite platforms involve a high one-time cost but a relatively low cost
per unit area of coverage, and they can acquire imagery of the entire earth without
requiring permission.
Space-borne imaging ranges from altitude 250 km to 36000 km.
Space-borne remote sensing provides the following advantages:
Large area coverage;
Frequent and repetitive coverage of an area of interest;
Quantitative measurement of ground features using radiometrically calibrated sensors;
Semi-automated computerised processing and analysis;
Relatively lower cost per unit area of coverage.
Spacecraft as Platform:
Remote sensing is also conducted from the space shuttle or artificial satellites.
Artificial satellites are manmade objects, which revolve around another object.
Satellite can cover much more land space than planes and can monitor areas on a
regular basis.
Later on, with the LANDSAT and SPOT satellite programs, space photography received a
major impetus.
ELECTROMAGNETIC SPECTRUM
The first requirement for remote sensing is to have an energy source to illuminate the target
(unless the sensed energy is being emitted by the target). This energy is in the form of
electromagnetic radiation. All electromagnetic radiation has fundamental properties and
behaves in predictable ways according to the basics of wave theory.
Wavelength
The wavelength is the length of one wave cycle, which can be measured as the distance
between successive wave crests. Wavelength is usually represented by the Greek letter
lambda (λ). Wavelength is measured in metres (m) or some factor of metres such as
nanometres (nm, 10^-9 metres), micrometres (μm, 10^-6 metres) or
centimetres (cm, 10^-2 metres). Frequency refers to the number of cycles of a wave passing a
fixed point per unit of time. Frequency is normally measured in hertz (Hz), equivalent to one
cycle per second, and various multiples of hertz.
Wavelength and frequency are related through the speed of light (c = λf); therefore, the two
are inversely related to each other. The shorter the wavelength, the higher
the frequency; the longer the wavelength, the lower the frequency. Understanding the
characteristics of electromagnetic radiation in terms of their wavelength and frequency is
crucial to understanding the information to be extracted from remote sensing data.
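To make the inverse wavelength-frequency relationship concrete, here is a minimal illustrative sketch (not part of the original notes; the function name and the rounded speed-of-light constant are our own choices):

```python
# Illustrative sketch of the inverse wavelength-frequency relation c = lambda * f.
C = 3.0e8  # approximate speed of light in vacuum, m/s

def frequency_hz(wavelength_m: float) -> float:
    """Frequency (Hz) of radiation with the given wavelength (m)."""
    return C / wavelength_m

# Blue light (0.4 um) has a shorter wavelength than red light (0.7 um),
# so it must have the higher frequency:
assert frequency_hz(0.4e-6) > frequency_hz(0.7e-6)
```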
The electromagnetic spectrum ranges from the shorter wavelengths (including gamma and x-
rays) to the longer wavelengths (including microwaves and broadcast radio waves). There are
several regions of the electromagnetic spectrum which are useful for remote sensing.
Ultraviolet or UV

For most purposes, the ultraviolet (UV) portion of the spectrum has the shortest wavelengths
practical for remote sensing. This radiation lies just beyond the violet portion of the visible
wavelengths, hence its name. Some earth surface materials, primarily rocks and minerals,
emit visible radiation when illuminated by UV radiation.
Visible Spectrum
The light which our eyes - our "remote sensors" - can detect is part of the visible spectrum. It
is important to recognize how small the visible portion is relative to the rest of the spectrum.
There is a lot of radiation around us which is "invisible" to our eyes, but can be detected by
other remote sensing instruments and used to our advantage. The visible wavelengths cover a
range from approximately 0.4 to 0.7 μm. The longest visible wavelength is red and the
shortest is violet. Common wavelengths of what we perceive as particular colours from the
visible portion of the spectrum are listed below. It is important to note that this is the only
portion of the spectrum we can associate with the concept of colors.
Blue, green, and red are the primary colours or wavelengths of the visible spectrum. They
are defined as such because no single primary colour can be created from the other two, but
all other colours can be formed by combining blue, green, and red in various proportions.
Although we see sunlight as a uniform or homogeneous colour, it is actually composed of
various wavelengths of radiation in primarily the ultraviolet, visible and infrared portions of
the spectrum. The visible portion of this radiation can be shown in its component colours
when sunlight is passed through a prism, which bends the light in differing amounts
according to wavelength.
Infrared (IR)
The next portion of the spectrum of interest is the infrared (IR) region, which covers the
wavelength range from approximately 0.7 μm to 100 μm, more than 100 times as wide as the
visible portion. The infrared region can be divided into three categories based on their radiation
properties: the reflected near-IR, the middle IR and the thermal IR.
The reflected near-IR, covering wavelengths from approximately 0.7 μm to 1.3 μm, is
commonly used to expose black-and-white and colour-infrared-sensitive film.
The middle-infrared region includes energy with a wavelength of 1.3 to 3.0 μm.
The thermal IR region is quite different from the visible and reflected IR portions, as this
energy is essentially the radiation that is emitted from the Earth's surface in the form of heat.
The thermal IR covers wavelengths from approximately 3.0 μm to 100 μm.
Microwave

The microwave region covers the longest wavelengths used for remote sensing, from
approximately 1 mm to 1 m.

The energy of electromagnetic radiation can also be expressed photon by photon as E = hf,
where E is the photon energy in joules, h is Planck's constant and f is the frequency in Hz.
PARTICLE THEORY

The basic idea of quantum theory is that radiant energy is transmitted in indivisible packets
whose energy is given in integral multiples of size hv, where h is Planck's constant
(6.6252 x 10^-34 J·s) and v is the frequency of the radiation. These packets are called quanta
or photons.
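The photon energy relation can be sketched numerically as follows (an illustrative example, not from the notes; constants are rounded and the function name is our own):

```python
# Illustrative sketch: photon energy E = h*v, equivalently E = h*c/lambda.
H = 6.6252e-34  # Planck's constant, J*s (value quoted in the notes)
C = 3.0e8       # approximate speed of light, m/s

def photon_energy(wavelength_m: float) -> float:
    """Energy in joules of a single photon of the given wavelength."""
    return H * C / wavelength_m

# A visible photon (0.5 um) carries more energy than a thermal-IR photon (10 um),
# which is why shorter wavelengths are more energetic per quantum:
assert photon_energy(0.5e-6) > photon_energy(10e-6)
```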
The dilemma of the simultaneous wave and particle natures of electromagnetic energy may be
conceptually resolved by considering that energy is not supplied continuously throughout a
wave, but rather is carried by photons. The classical wave theory does not give the
intensity of energy at a point in space, but gives the probability of finding a photon at that
point. Thus the classical concept of a wave yields to the idea that a wave simply describes the
probability path for the motion of the individual photons.
The particular importance of the quantum approach for remote sensing is that it provides the
concept of discrete energy levels in materials. The values and arrangement of these levels are
different for different materials. Information about a given material is thus available in
electromagnetic radiation as a consequence of transitions between these energy levels. A
transition to a higher energy level is caused by the absorption of energy; a transition from a
higher to a lower energy level is caused by the emission of energy. The amounts of energy
either absorbed or emitted correspond precisely to the energy difference between the two
levels involved in the transition. Because the energy levels are different for each material, the
amount of energy a particular substance can absorb or emit differs from that of any other
material. Consequently, the positions and intensities of the bands in the spectrum of a given
material are characteristic of that material.
STEFAN–BOLTZMANN LAW

The Stefan–Boltzmann law, also known as Stefan's law, describes the power radiated from a
black body in terms of its temperature. Specifically, it states that the total energy radiated per
unit surface area of a black body across all wavelengths per unit time (also known as the
black-body radiant exitance or emissive power), M, is directly proportional to the fourth
power of the black body's thermodynamic temperature T:

M = σT^4

where σ = 5.67 x 10^-8 W m^-2 K^-4 is the Stefan–Boltzmann constant.
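A short numeric sketch of the fourth-power dependence (illustrative only; the constant is rounded and the function name is our own):

```python
# Illustrative sketch of the Stefan-Boltzmann law M = sigma * T**4.
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def radiant_exitance(temp_k: float) -> float:
    """Total black-body radiant exitance (W/m^2) at absolute temperature temp_k."""
    return SIGMA * temp_k ** 4

# Doubling the temperature increases the radiated power by a factor of 2**4 = 16:
assert abs(radiant_exitance(600) / radiant_exitance(300) - 16) < 1e-9
```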
WIEN'S DISPLACEMENT LAW

Wien's displacement law states that the black-body radiation curve for different temperatures
peaks at a wavelength inversely proportional to the temperature. The shift of that peak is a
direct consequence of Planck's radiation law, which describes the spectral brightness of
black-body radiation as a function of wavelength at any given temperature. However, it had
been discovered by Wilhelm Wien several years before Max Planck developed that more
general equation, and it describes the entire shift of the spectrum of black-body radiation
toward shorter wavelengths as temperature increases.

Formally, Wien's displacement law states that the spectral radiance of black-body radiation
per unit wavelength peaks at the wavelength λmax given by:

λmax = b / T

where b ≈ 2898 μm·K is Wien's displacement constant and T is the absolute temperature in kelvin.
From Fig. 4, it can be observed that the peak of the radiant exitance varies with wavelength. As
the temperature increases, the peak shifts towards shorter wavelengths (to the left). This is
explained by Wien's displacement law, which states that the dominant wavelength at which a
black body radiates, λm, is inversely proportional to the absolute temperature of the black body
(in K): λm = A/T, where A is a constant (approximately 2898 μm·K).
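The displacement can be verified with a small sketch (illustrative; the rounded constant and function name are our own choices):

```python
# Illustrative sketch of Wien's displacement law: lambda_max = b / T.
B_UM_K = 2898.0  # Wien's displacement constant, um*K (approximate)

def peak_wavelength_um(temp_k: float) -> float:
    """Wavelength (um) at which a black body at temp_k radiates most strongly."""
    return B_UM_K / temp_k

# The Sun (~5778 K) peaks near 0.5 um (visible green), while the Earth (~300 K)
# peaks near 9.7 um (thermal infrared), matching the shift described above:
assert peak_wavelength_um(5778) < peak_wavelength_um(300)
```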
Fig 2.1 Energy Interaction with Atmosphere
SCATTERING
Scattering occurs when particles or large gas molecules present in the atmosphere interact
with electromagnetic radiation and cause it to be redirected from its original path. How much
scattering takes place depends on several factors, including the wavelength of the radiation,
the abundance of particles or gases, and the distance the radiation travels through the
atmosphere. There are three types of scattering which take place.
RAYLEIGH SCATTERING
Rayleigh scattering occurs when particles are very small compared to the wavelength of the
radiation. These could be particles such as small specks of dust, or nitrogen and oxygen
molecules. Rayleigh scattering causes shorter wavelengths of energy to be scattered much
more than longer wavelengths. Rayleigh scattering is the dominant scattering mechanism in
the upper atmosphere. The fact that the sky appears "blue" during the day is because of this
phenomenon. As sunlight passes through the atmosphere, the shorter wavelengths (i.e. blue)
of the visible spectrum are scattered more than the other (longer) visible wavelengths. At
sunrise and sunset the light has to travel farther through the atmosphere than at midday and
the scattering of the shorter wavelengths is more complete; this leaves a greater proportion of
the longer wavelengths to penetrate the atmosphere.
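The strong wavelength dependence of Rayleigh scattering (intensity roughly proportional to 1/λ^4) can be sketched as follows; this is an illustrative example, not part of the notes:

```python
# Illustrative sketch: Rayleigh scattering intensity is proportional to 1/lambda**4.
def rayleigh_ratio(lam_a_um: float, lam_b_um: float) -> float:
    """How much more strongly wavelength lam_a is Rayleigh-scattered than lam_b."""
    return (lam_b_um / lam_a_um) ** 4

# Blue light (0.4 um) is scattered roughly 9.4 times more strongly than
# red light (0.7 um), which is why the clear sky looks blue:
assert rayleigh_ratio(0.4, 0.7) > 9
```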
ABSORPTION
Absorption is the other main mechanism at work when electromagnetic radiation interacts
with the atmosphere. In contrast to scattering, this phenomenon causes molecules in the
atmosphere to absorb energy at various wavelengths. Ozone, carbon dioxide, and water vapor
are the three main atmospheric constituents which absorb radiation. Ozone serves to absorb
the ultraviolet radiation from the sun that is harmful to most living things. Without this
protective layer in the atmosphere our skin would burn when exposed to sunlight. Carbon
dioxide is referred to as a greenhouse gas because it tends to absorb radiation strongly in the
far-infrared portion of the spectrum (the area associated with thermal heating), which serves
to trap this heat inside the atmosphere. Water vapour in the atmosphere absorbs much of the
incoming longwave infrared and shortwave microwave radiation (between 22 μm and 1 m).
The presence
of water vapour in the lower atmosphere varies greatly from location to location and at
different times of the year. For example, the air mass above a desert would have very little
water vapour to absorb energy, while the tropics would have high concentrations of water
vapour (i.e. high humidity).
MIE SCATTERING
Mie scattering occurs when the particles are just about the same size as the wavelength of the
radiation. Dust, pollen, smoke and water vapour are common causes of Mie scattering which
tends to affect longer wavelengths than those affected by Rayleigh scattering. Mie scattering
occurs mostly in the lower portions of the atmosphere where larger particles are more
abundant, and dominates when cloud conditions are overcast.
NONSELECTIVE SCATTERING
The final scattering mechanism of importance is called nonselective scattering. This occurs
when the particles are much larger than the wavelength of the radiation.
Water droplets and large dust particles can cause this type of scattering. Nonselective
scattering gets its name from the fact that all wavelengths are scattered about equally. This
type of scattering causes fog and clouds to appear white to our eyes because blue, green, and
red light are all scattered in approximately equal quantities (blue+green+red light = white
light).
ATMOSPHERIC WINDOWS
While EMR is transmitted from the sun to the surface of the earth, it passes through the
atmosphere, where electromagnetic radiation is scattered and absorbed by gases and dust
particles. Besides the major atmospheric gaseous components such as molecular nitrogen and
oxygen, other constituents such as water vapour, methane, hydrogen, helium and nitrogen
compounds play an important role in modifying electromagnetic radiation. This affects image
quality. Regions of the electromagnetic spectrum in which the atmosphere is transparent are
called atmospheric windows. In other words, certain spectral regions in which electromagnetic
radiation passes through the atmosphere without much attenuation are called atmospheric
windows. The atmosphere is practically transparent in the visible region of the
electromagnetic spectrum, and therefore many satellite-based remote sensing sensors are
designed to collect data in this region. Some of the commonly used atmospheric windows
are shown in the figure. They are: 0.38-0.72 microns (visible), 0.72-3.00 microns (near
infrared and middle infrared), and 8.00-14.00 microns (thermal infrared).
[Figure: Atmospheric transmission as a function of wavelength, 0.3 microns to 1 mm, showing the UV, visible and infrared windows and the regions where energy is blocked.]
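As a simple illustration, the window ranges quoted above can be checked programmatically; this helper is hypothetical and only a sketch of the idea:

```python
# Hypothetical helper using the atmospheric window ranges quoted in the notes (microns).
WINDOWS_UM = [(0.38, 0.72), (0.72, 3.00), (8.00, 14.00)]  # visible, near/mid-IR, thermal IR

def in_atmospheric_window(wavelength_um: float) -> bool:
    """True if the wavelength falls inside one of the listed windows."""
    return any(lo <= wavelength_um <= hi for lo, hi in WINDOWS_UM)

# A 5 um wavelength falls between the reflected-IR and thermal-IR windows,
# so the atmosphere strongly attenuates it:
assert not in_atmospheric_window(5.0)
```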
The characteristic spectral reflectance curve (Fig. 2.3) for water shows, from about 0.5 μm, a
reduction in reflectance with increasing wavelength, so that in the near-infrared range the
reflectance of deep, clear water is virtually zero (Mather, 1987). However, the spectral
reflectance of water is significantly affected by the presence of dissolved and suspended
organic and inorganic material and by the depth of the water body. Fig. 1.8 shows the spectral
reflectance curves for visible and near-infrared wavelengths at the surface and at 20 m depth.
Suspended solids in water scatter the downwelling radiation, the degree of scatter being
proportional to the concentration and the colour of the sediment. Experimental studies in the
field and in the
laboratory as well as experience with multispectral remote sensing have shown that the
specific targets are characterized by an individual spectral response. Indeed, the successful
development of remote sensing of environment over the past decade bears witness to its
validity. In the remaining part of this section, typical and representative spectral reflectance
curves for characteristic types of surface materials are considered. Imagine a beach on a
beautiful tropical island, and consider the interaction of electromagnetic radiation with the
top layer of sand grains on the beach. When an incident ray of electromagnetic radiation
strikes an air/grain interface, part of the ray is reflected and part of it is transmitted into the
sand grain. The solid lines in the figure represent the incident rays, and dashed lines 1, 2,
and 3 represent rays reflected from the surface that have never penetrated a sand grain. The
latter are called specular rays by Vincent and Hunt (1968), and surface-scattered rays by
Salisbury and Wald (1992); these rays result from first-surface reflection from all grains
encountered. For a given
reflecting surface, all specular rays are reflected in the same direction, such that the angle of
reflection (the angle between the reflected rays and the normal, or perpendicular, to the
reflecting surface) equals the angle of incidence (the angle between the incident rays and the
surface normal). The measure of how much electromagnetic radiation is reflected off a
surface is called its reflectance, a number between 0 and 1.0. A measure of 1.0 means that
100% of the incident radiation is reflected off the surface, and a measure of 0 means that 0%
is reflected.
ENERGY INTERACTIONS WITH EARTH SURFACE FEATURES
Energy incident on the earth's surface is absorbed, transmitted or reflected depending on the
wavelength and characteristics of the surface features (such as barren soil, vegetation, water
body). Interaction of the electromagnetic radiation with the surface features is dependent on
the characteristics of the incident radiation and the feature characteristics. After interaction
with the surface features, energy that is reflected or re-emitted from the features is recorded at
the sensors and is analysed to identify the target features, interpret the distance of the object,
and/or determine its characteristics.
This lecture explains the interaction of the electromagnetic energy with the earth's surface
features.
Energy Interactions
The incident electromagnetic energy may interact with the earth surface features in three
possible ways: Reflection, Absorption and Transmission. These three interactions are
illustrated in the figure, which shows incident radiation being split into reflected, absorbed
and transmitted components at the earth's surface.

Reflection occurs when radiation is redirected after hitting the target. According to the law of
reflection, the angle of incidence is equal to the angle of reflection.

Absorption occurs when radiation is absorbed by the target. The EM energy which is
absorbed by the earth's surface becomes available for emission as thermal radiation at longer
wavelengths.
Transmission occurs when radiation is allowed to pass through the target. Depending upon
the characteristics of the medium, the velocity and wavelength of the radiation change during
transmission, whereas the frequency remains the same. The transmitted energy may further
get scattered and/or absorbed in the medium.
These three processes are not mutually exclusive. Energy incident on a surface may be
partially reflected, absorbed or transmitted. Which process takes place on a surface depends
on the following factors:
Wavelength of the radiation
Angle at which the radiation intersects the surface
Composition and physical properties of the surface
The relationship between reflection, absorption and transmission can be expressed through
the principle of conservation of energy. Let EI denotes the incident energy, ER denotes the
reflected energy, EA denotes the absorbed energy and ET denotes the transmitted energy.
Then the principle of conservation of energy (as a function of wavelength λ) can be expressed
as
EI (λ) = ER (λ) + EA (λ) + ET (λ) (1)
Since most remote sensing systems use reflected energy, the energy balance relationship can
be better expressed in the form
ER (λ) = EI (λ) - EA (λ) - ET (λ) (2)
The reflected energy is equal to the total energy incident on any given feature reduced by the
energy absorbed or transmitted by that feature.
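The energy balance of equation (2) can be sketched directly; this is an illustrative example (the function name is our own):

```python
# Illustrative sketch of the energy balance ER = EI - EA - ET at a given wavelength.
def reflected_energy(e_incident: float, e_absorbed: float, e_transmitted: float) -> float:
    """Reflected energy from conservation of energy (all quantities in the same units)."""
    return e_incident - e_absorbed - e_transmitted

# If 100 units are incident, 30 absorbed and 20 transmitted, 50 are reflected:
assert reflected_energy(100.0, 30.0, 20.0) == 50.0
```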
Reflection

Reflection is the process in which the incident energy is redirected in such a way that the angle
of incidence is equal to the angle of reflection; the reflected radiation leaves the surface at the
same angle at which the incident radiation struck it.

When electromagnetic energy is incident on the surface, it may get reflected or scattered
depending upon the roughness of the surface relative to the wavelength of the incident
energy. If the roughness of the surface is less than the wavelength of the radiation (i.e. the
ratio of roughness to wavelength is less than 1), the radiation is reflected. When the ratio is
more than 1, i.e. the roughness is more than the wavelength, the radiation is scattered.
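The roughness-to-wavelength criterion described above can be sketched as a small helper (hypothetical, for illustration only):

```python
# Hypothetical sketch of the roughness-to-wavelength criterion described in the text.
def reflection_type(roughness: float, wavelength: float) -> str:
    """'reflected' (specular) if roughness/wavelength < 1, else 'scattered' (diffuse)."""
    return "reflected" if roughness / wavelength < 1 else "scattered"

# Fine sand (~millimetre-scale roughness) is rough at visible wavelengths,
# so visible light is scattered rather than specularly reflected:
assert reflection_type(1e-3, 0.5e-6) == "scattered"
```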
The fraction of energy that is reflected or scattered is unique for each material. This aids in
distinguishing different features on an image.
A feature class denotes a distinguishing primitive characteristic or attribute of an image that
has been classified to represent a particular land cover type or spectral signature. Within one
feature class, the proportion of energy reflected, emitted or absorbed depends on the
wavelength. Hence, in one spectral range two features may be indistinguishable, while their
reflectance properties may differ in another spectral band. In multispectral remote sensing,
multiple sensors are used to record the reflectance from the surface features at different
wavelength bands and hence to differentiate the target features.
Variations in the spectral reflectance within the visible spectrum give the colour effect to the
features.
For example, blue colour is the result of greater reflection of blue light. An object appears
"green" when it reflects strongly in the green portion of the visible spectrum. Leaves appear
green since their chlorophyll pigment absorbs radiation in the red and blue wavelengths but
reflects green wavelengths. Similarly, water looks blue-green, blue or green when viewed
through the visible band because it reflects the shorter wavelengths and absorbs the longer
wavelengths in the visible band. Water also absorbs the near-infrared wavelengths and hence
appears darker when viewed through red or near-infrared wavelengths. The human eye uses
reflected energy variations in the visible spectrum to discriminate between various features.
For example, Fig. 5 shows a part of the Krishna River Basin as seen in different bands of the
Landsat ETM+ imagery. As the concepts of false colour composites (FCC) have been covered
in module 4, readers are advised to refer to the material in module 4 for a better understanding
of the colour composite imageries shown in Fig. 5. The reflectance of surface features such as
water, vegetation and fallow lands differs across wavelength bands, and a combination of
more than one spectral band helps to distinguish such features.
Specular reflection: It occurs when the surface is smooth and flat. A mirror-like or smooth
reflection is obtained where complete or nearly complete incident energy is reflected in one
direction. The angle of reflection is equal to the angle of incidence. Reflection from the
surface is the maximum along the angle of reflection, whereas in any other direction it is
negligible.
Diffuse (Lambertian) reflection: It occurs when the surface is rough. The energy is reflected
uniformly in all directions. Since all the wavelengths are reflected uniformly in all directions,
diffuse reflection contains spectral information on the "color" of the reflecting surface. Hence,
in remote sensing the diffuse reflectance properties of terrain features are measured. Since the
reflection is uniform in all directions, sensors located in any direction record the same
reflectance, and hence it is easy to differentiate the features.
Based on the nature of reflection, surface features can be classified as specular
reflectors or Lambertian reflectors. An ideal specular reflector completely reflects the incident
energy with the angle of reflection equal to the angle of incidence. An ideal Lambertian or
diffuse reflector scatters all the incident energy equally in all directions.
The specular or diffusive characteristic of any surface is determined by the roughness of the
surface in comparison to the wavelength of the incoming radiation. If the wavelengths of the
incident energy are much smaller than the surface variations or the particle sizes, diffuse
reflection will dominate. For example, in the relatively long wavelength radio range, rocky
terrain may appear smooth to incident energy. In the visible portion of the spectrum, even a
material such as fine sand appears rough while it appears fairly smooth to long wavelength
microwaves.
Most surface features of the earth are neither perfectly specular nor perfectly diffuse
reflectors. In near specular reflection, though the reflection is the maximum along the angle
of reflection, a fraction of the energy also gets reflected in some other angles as well. In near
Lambertian reflector, the reflection is not perfectly uniform in all the directions. The
characteristics of different types of reflectors are shown in the figure below.

[Figure: Reflection patterns for ideal specular, near-specular, near-diffuse and ideal diffuse reflectors, showing the angle of incidence and the angle of reflection.]
Lambertian reflectors are considered ideal for remote sensing. The reflection from an
ideal Lambertian surface will be the same irrespective of the location of the sensor. On the
other hand, in case of an ideal specular reflector, maximum brightness will be obtained only
at one location and for the other locations dark tones will be obtained from the same target.
This variation in the spectral signature for the same feature affects the interpretation of the
remote sensing data.
Most natural surfaces observed using remote sensing are approximately Lambertian at
visible and IR wavelengths. However, water provides specular reflection. Water generally
gives a dark tone in the image. However due to the specular reflection, it gives a pale tone
when the sensor is located in the direction of the reflected energy.
Spectral Reflectance of Earth Surface
Vegetation
In general, healthy vegetation is a very good absorber of electromagnetic energy in the visible
region. Chlorophyll strongly absorbs light at wavelengths around 0.45 μm (blue) and 0.67 μm
(red) and reflects strongly in green light; therefore our eyes perceive healthy vegetation as
green. Healthy plants have a high reflectance in the near-infrared between 0.7 and 1.3 μm.
This is primarily due to healthy internal structure of plant leaves. As this internal structure
varies amongst different plant species, the near infrared wavelengths can be used to
discriminate between different plant species.
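A standard way to exploit this red/near-infrared contrast, though not described in these notes themselves, is the normalized difference vegetation index (NDVI); the sketch below is illustrative:

```python
# Illustrative sketch of NDVI, a common index built on the red/NIR reflectance contrast.
def ndvi(nir: float, red: float) -> float:
    """Normalized difference vegetation index from NIR and red reflectances (0-1)."""
    return (nir - red) / (nir + red)

# Healthy vegetation has high NIR reflectance and strong red absorption,
# so its NDVI is close to 1, while sparse or stressed vegetation scores lower:
assert ndvi(0.5, 0.05) > ndvi(0.2, 0.15)
```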
Water
In its liquid state, water has relatively low reflectance, with clear water having the greatest
reflectance in the blue portion of the visible part of the spectrum. Water has high absorption
and virtually no reflectance in near infrared wavelengths range and beyond. Turbid water has
a higher reflectance in the visible region than clear water. This is also true for waters
containing high chlorophyll concentrations.
Snow and Ice

Ice and snow generally have high reflectance across all visible wavelengths, hence their
bright white appearance. Reflectance decreases in the near-infrared portion, and there is very
low reflectance in the SWIR (shortwave infrared). The low reflectance of ice and snow in the
SWIR is related to their microscopic liquid water content. Reflectance differs for snow and
ice depending on the actual composition of the material, including impurities and grain size.
Soil
Bare soil generally has reflectance that increases with wavelength, with greater reflectance in the near-infrared and
shortwave infrared. Some of the factors affecting soil reflectance are:
Moisture content
Surface roughness
The Earth's atmosphere is divided into several main regions, each with distinct characteristics
in terms of temperature, pressure, composition, and other properties.
1. Troposphere:
Altitude: 0 to approximately 8-15 kilometers (0 to 5-9 miles).
Characteristics:
Decreasing temperature with altitude.
Where weather events, including clouds and precipitation, occur.
Contains the majority of the Earth's atmospheric mass.
2. Stratosphere:
Altitude: Approximately 15 to 50 kilometers (9 to 31 miles).
Characteristics:
Temperature generally increases with altitude due to the presence of
the ozone layer.
Contains the ozone layer, which absorbs and scatters ultraviolet (UV)
solar radiation.
Jet streams are found in the upper part of this layer.
3. Mesosphere:
Altitude: Approximately 50 to 85 kilometers (31 to 53 miles).
Characteristics:
Decreasing temperature with altitude.
The region where meteorites burn up upon entering the Earth's
atmosphere.
Contains the mesopause, the coldest region of the atmosphere.
4. Thermosphere:
Altitude: Approximately 85 kilometers and extends upward to about 600
kilometers (53 miles to about 373 miles).
Characteristics:
Temperature increases significantly with altitude due to the absorption
of high-energy solar radiation.
Extremely low pressure and density.
The region where auroras occur.
5. Exosphere:
Altitude: Beyond 600 kilometers (373 miles) and extends into space.
Characteristics:
Gradual transition to the vacuum of space.
Very low density of gas particles.
Satellites orbit in this region.
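The altitude ranges above can be summarized in a small lookup table, using the approximate boundary values from these notes:

```python
# Classify an altitude (km) into the atmospheric layers described above.
# Boundaries are the approximate figures quoted in these notes.
LAYERS = [
    ("Troposphere", 0, 15),
    ("Stratosphere", 15, 50),
    ("Mesosphere", 50, 85),
    ("Thermosphere", 85, 600),
    ("Exosphere", 600, float("inf")),
]

def layer_of(altitude_km):
    for name, lo, hi in LAYERS:
        if lo <= altitude_km < hi:
            return name
    raise ValueError("altitude must be non-negative")

print(layer_of(10))   # Troposphere (weather events)
print(layer_of(400))  # Thermosphere (typical crewed-station altitude)
```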
One of the two remaining processes that influence electromagnetic radiation as it passes
through the atmosphere is scattering. Scattering happens when a photon interacts with
something in the atmosphere that causes it to change direction. Depending on the size of the
object that the photon interacts with, two distinct types of scattering are
recognized. Rayleigh scattering happens when the object is much smaller than the wavelength
of the radiation. In the case of sunlight and the Earth’s atmosphere this means that Rayleigh
scattering is caused by atmospheric gases such as N2, O2, and CO2. Mie scattering happens when
the object is similar in size to the wavelength of the radiation, which means that it is caused
by aerosols like smoke and dust particles. Additional scattering can happen if radiation
interacts with particles larger in size than its wavelength, like water droplets or sand particles.
While refraction is predictable and can be determined by Snell’s Law, scattering is an
inherently stochastic process: what happens to an individual photon as it passes through the
atmosphere is entirely unpredictable, including whether or not it experiences any scattering,
and if so which direction it is reemitted in. However, the magnitude and direction of
scattering that happens on average to the many photons in a radiation field is predictable.
Rayleigh scattering
A fact that has great importance for remote sensing of the Earth is that the magnitude of
Rayleigh scattering is inversely related to the 4th power of the wavelength of the radiation. In
other words, radiation with shorter wavelengths is scattered much more by Rayleigh
scattering than radiation at longer wavelengths. In the visible wavelengths, this means that
blue light is scattered more than green light, which in turn is scattered more than red light.
This is the process that makes the Earth’s oceans look blue when viewed from space. What
happens is that over very dark Earth surfaces, such as the oceans, the majority of radiation
reaching the Earth surface is absorbed rather than reflected by it. What is visible from space
is thus not radiation reflected by the surface, but rather radiation scattering from within the
atmosphere. Because blue wavelengths are those most strongly scattered through Rayleigh
scattering, this scattered radiation as a whole looks blue (Figure 23). Another effect of
Rayleigh scattering is that regardless of what is on the Earth’s surface, a space-based sensor
will detect a substantial amount of blue light coming from the Earth-Atmosphere system.
This can be a problem because the ‘blue signal’ from the atmosphere overwhelms variations
in ‘blue reflectance’ on the surface. But it can also be an advantage, because measurements in
the blue wavelengths can help assess the strength of Rayleigh scattering across the visible and
infrared spectrum, which can in turn be corrected for. This is the basis for the ‘aerosol’ band
that was included on Landsat 8 OLI (but was not found on its predecessor instruments), on
Sentinel-2, and on the WorldView-2 and -3 sensors.
While any scattering in the atmosphere is a source of noise (for those interested in using
satellite imagery to characterize the Earth’s surface), Rayleigh scattering is a relatively
benign source of noise because its wavelength dependence makes it largely predictable, and
because the gases responsible for it tend to have stable concentrations across space and time.
Rayleigh scattering is therefore not a source of great uncertainty for most remote sensing
applications.
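The λ⁻⁴ dependence can be illustrated numerically. Relative to red light at 700 nm (the reference wavelength is chosen for illustration):

```python
# Relative strength of Rayleigh scattering, which is proportional to 1/wavelength^4.
def rayleigh_relative(wavelength_nm, reference_nm=700.0):
    """Scattering strength relative to the reference wavelength (red, 700 nm)."""
    return (reference_nm / wavelength_nm) ** 4

print(round(rayleigh_relative(450), 2))  # blue: ~5.86x stronger than red
print(round(rayleigh_relative(550), 2))  # green: ~2.62x stronger than red
```

This is why scattered skylight, and the ocean seen from space, looks blue: the short blue wavelengths dominate the scattered radiation field.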
Mie scattering
Mie scattering, because its strength and wavelength dependence depend on the type and
density of the particulates that cause it, varies substantially through time and space.
As a result it is one of the most important causes of uncertainty in remote sensing, especially
when using satellite data to study dark parts of the Earth’s surface from which the amount of
reflected radiation is small relative to the total signal from atmospheric scattering. For the
same reason it is hard to generalize its importance, but broadly speaking the strength of Mie
scattering exceeds that of Rayleigh scattering, and while it still diminishes with increasing
wavelength its influence extends further into the infrared spectrum. Because Mie scattering is
caused by atmospheric particulates, it is often dramatically increased during dust storms,
forest fires, or other events that cause the atmospheric aerosol load to increase. One such
example is seen in Figure 24.
Absorption
The last important thing that happens to electromagnetic radiation as it passes through the
atmosphere is that it is partially absorbed by atmospheric gases (mostly H2O, CO2 and O3).
While the energy absorbed is ultimately re-emitted by these gas molecules, the re-emission
happens at wavelengths typically outside the spectrum considered in optical remote sensing
(but which may be important for thermal remote sensing), so for practical purposes the
absorbed photons can be considered gone when absorbed. The strength of absorption is
highly dependent on wavelength because it happens most easily when the radiation has a
wavelength (frequency) that is similar to a resonant frequency of the gas doing the
absorption, which in turn depends on its atomic or molecular structure. For example, due to
its molecular structure, O2 is particularly good at absorbing electromagnetic radiation with
wavelengths right around 760 nm, but not at 750 or 770 nm. Similar wavelengths exist at
which other gases are effective or not at absorbing EMR, and in combination the
atmospheric gases let some wavelengths pass through the atmosphere with almost no
absorption, while other wavelengths are almost entirely absorbed before they reach the
surface of the Earth (Figure 25 and Figure 26). As is especially clear in Figure 26, water
vapour is responsible for much of the total gaseous absorption of EMR in the atmosphere,
including in the visible spectrum (not clearly shown on that figure). This is an important
challenge for remote sensing because while the concentrations of the other gasses are
relatively stable through time and space, water vapour concentrations vary greatly through
time (humid vs. dry days) and through space (dry arctic vs. humid tropical regions).
Spectral signatures
Each surface material has a spectral signature that defines what proportion of radiation is
reflected for each wavelength. For example, water
reflects a small amount of blue and green wavelengths (typically around 5% – 10%
depending on turbidity), less of the red wavelengths, and almost nothing in the infrared
wavelengths. Vegetation, on the other hand, reflects around half of all incoming infrared
radiation, except for specific wavelengths that are effectively absorbed by liquid water in the
leaves. These spectral signatures are commonly portrayed as graphs, with wavelengths along
the x-axis and reflectance along the y-axis (as in Figure 27).
Spectral signatures are what enable us to differentiate between different materials on the
Earth’s surface when we look at a satellite image. As shown in Figure 27, water has near-zero
reflectance at wavelengths longer than 0.7 μm (700 nm), while both soil and green vegetation
have reflectances around 40% at 1.3 μm. Measuring the amount of radiation reflected off the
Earth-Atmosphere system at 1.3 μm will thus be particularly helpful at differentiating water
from the two terrestrial surface types. Similarly, measurements at wavelengths around 1.4 μm
(where liquid water in vegetation is a strong absorber) or 1.9 μm (same) can be effective to
differentiate between soil and green vegetation.
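The differences described above suggest a simple rule-of-thumb classifier. The thresholds and sample reflectance values below are illustrative assumptions, not calibrated figures:

```python
# Rule-of-thumb classifier based on the reflectance behaviour described above.
def classify(r_1300, r_1400):
    """r_1300, r_1400: reflectance (0-1) at 1.3 um and 1.4 um."""
    if r_1300 < 0.05:             # water absorbs nearly all infrared radiation
        return "water"
    if r_1400 < 0.5 * r_1300:     # liquid water in leaves absorbs strongly near 1.4 um
        return "vegetation"
    return "soil"                 # soil reflectance stays relatively flat here

print(classify(0.02, 0.01))  # water
print(classify(0.40, 0.10))  # vegetation
print(classify(0.40, 0.35))  # soil
```

Real classification uses many bands and statistical or machine-learning methods, but the principle of exploiting wavelength-dependent differences is the same.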
As a more detailed example, spectral signatures have been effective for large-scale geological
surveying/prospecting because different minerals (that may be characteristic of different sub-
surface conditions) can be identified through their unique spectral signatures (Figure 28).
The part of the radiation field that is reflected by the Earth’s surface must naturally make its
way back up through the atmosphere, with the attendant refraction, scattering and absorption,
before it can be measured by any space-based sensor. While there are many relative
advantages and disadvantages to air-borne vs. space-borne sensors, the ability of air-borne
sensors to measure the reflected EMR field before it has had to pass through the atmosphere a
second time is one distinct advantage.
Atmospheric Windows:
Atmospheric windows refer to specific wavelength ranges in the electromagnetic spectrum
where the Earth's atmosphere is relatively transparent, allowing certain types of
electromagnetic radiation to pass through with minimal absorption or interference. These
windows are crucial for observations and measurements from ground-based or space-based
instruments, as they enable the study of celestial objects, weather patterns, and various Earth
processes. Different atmospheric windows exist for different regions of the electromagnetic
spectrum. Some key atmospheric windows include:
1. Visible Light:
The atmosphere is highly transparent to visible light, allowing sunlight to
reach the Earth's surface. This transparency is essential for human vision and
for various optical observations.
2. Near-Infrared (NIR):
ΣE_in − ΣE_out = ΔE_system
Where:
ΣE_in represents the sum of all forms of energy entering the system.
ΣE_out represents the sum of all forms of energy leaving the system.
ΔE_system represents the change in internal energy of the system.
This equation is based on the first law of thermodynamics, which states that energy cannot be
created or destroyed; it can only change forms. The energy balance equation helps analyze
and quantify the flow of energy in a given system, whether it's a physical process, a chemical
reaction, or an environmental system.
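As a trivial numerical sketch of the balance equation (units are arbitrary but must match):

```python
def energy_change(inflows, outflows):
    """First-law balance: change in system energy = sum(in) - sum(out)."""
    return sum(inflows) - sum(outflows)

# 120 units enter, 90 leave: internal energy rises by 30.
print(energy_change([100.0, 20.0], [90.0]))  # 30.0
```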
Specular and diffuse reflectors:
1. Specular Reflectance in Remote Sensing:
Characteristics:
Specular reflection is associated with smooth and reflective surfaces.
Light reflects off such surfaces in a specific direction, following the
law of reflection.
Applications:
Specular reflection is significant in the study of water bodies. For
instance, the sun's reflection on a calm water surface can create
specular highlights.
It is relevant for monitoring highly reflective surfaces such as glass,
metal, or other smooth materials.
2. Diffuse Reflectance in Remote Sensing:
Characteristics:
Diffuse reflection involves the scattering of light in various directions,
typical of rough or non-reflective surfaces.
Definition: Spectral emittance is the ratio of the radiant exitance of a surface (emitted
radiation) at a particular wavelength to the radiant exitance of a perfect blackbody at
the same temperature.
1. Spectral Bands:
Remote sensing instruments, such as satellites or airborne sensors, are
equipped with sensors that capture electromagnetic radiation in specific bands
or ranges of the electromagnetic spectrum.
Each spectral band corresponds to a particular range of wavelengths, and the
combination of bands forms the spectral signature of a surface.
4. Feature Identification:
Spectral signatures are used to identify and discriminate between different
land cover features. For example, vegetation, water bodies, soil, and urban
areas have distinct spectral signatures.
Comparing the spectral signature of an unknown area to spectral libraries
helps in feature identification.
5. Temporal Variability:
Spectral signatures can vary over time due to seasonal changes, growth cycles,
or other environmental factors.
Monitoring temporal changes in spectral signatures is valuable for tracking
land cover dynamics and assessing environmental conditions.
Solid Surface Scattering in the Microwave Region:
Solid surface scattering in the microwave region refers to the interaction of microwave
radiation with the Earth's surface when it is predominantly composed of solid materials, such
as soil, rock, or man-made structures. Microwave remote sensing, particularly in the
microwave region of the electromagnetic spectrum, has applications in fields like radar
imaging, soil moisture estimation, and geological studies. Here are some key points related to
solid surface scattering in the microwave region:
1. Surface Roughness:
The interaction between microwaves and a solid surface is influenced by the
roughness of the surface.
In the microwave region, rough surfaces can cause scattering of the incident
radiation in various directions.
2. Frequency Dependence:
The behavior of solid surface scattering depends on the frequency of the
microwaves.
Higher frequency microwaves tend to interact more with the surface
roughness, leading to increased scattering effects.
3. Radar Cross Section (RCS):
The Radar Cross Section is a measure of how well a target reflects radar
signals.
Solid surfaces with irregularities or roughness can have a significant impact on
the RCS, influencing the detectability of objects.
4. Vegetation and Dielectric Properties:
In areas with vegetation cover, the interaction of microwaves with leaves and
branches can also contribute to scattering.
The dielectric properties of the solid materials play a role in determining how
microwaves penetrate or interact with the surface.
5. Soil Moisture Sensing:
Microwave remote sensing is often used to estimate soil moisture content, as
the interaction between microwaves and the soil surface is influenced by its
moisture content.
Moisture affects the dielectric properties of soil, impacting the scattering and
absorption of microwaves.
6. Geological Applications:
In geological studies, microwave remote sensing can be used to analyze the
composition and structure of solid surfaces.
Differences in the microwave response can help identify geological features
and material types.
7. Synthetic Aperture Radar (SAR):
Synthetic Aperture Radar is a type of radar used in microwave remote sensing.
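A standard rule for deciding whether a surface behaves as "smooth" (specular) or "rough" (diffusely scattering) at a given radar wavelength is the Rayleigh roughness criterion, h < λ/(8 cos θ). A sketch, using C-band parameters similar to those of common SAR missions as illustrative inputs:

```python
import math

def is_smooth(h_rms_m, wavelength_m, incidence_deg):
    """Rayleigh criterion: a surface acts as 'smooth' (specular) for radar
    when its RMS height variation h satisfies h < wavelength / (8 * cos(theta))."""
    theta = math.radians(incidence_deg)
    return h_rms_m < wavelength_m / (8.0 * math.cos(theta))

# C-band radar (wavelength ~5.6 cm) at 30 degrees incidence:
print(is_smooth(0.005, 0.056, 30))  # 5 mm roughness -> True (smooth)
print(is_smooth(0.05, 0.056, 30))   # 5 cm roughness -> False (rough)
```

The same surface can therefore appear smooth at long wavelengths and rough at short ones, which is one reason the frequency dependence noted above matters.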
UNIT 3
ORBITS AND PLATFORMS
INTRODUCTION
Remote sensing has revolutionized our ability to observe and understand the Earth's surface
and atmosphere from afar. By utilizing various platforms and orbits, remote sensing
technologies enable us to gather valuable data for a wide range of applications, including
environmental monitoring, disaster management, urban planning, agriculture, and climate
studies. In this introduction, we will explore the fundamentals of orbits and platforms used in
remote sensing and their significance in acquiring high-quality data.
Orbits
Orbits play a critical role in remote sensing missions as they determine the trajectory and
coverage of the satellite or sensor system.
TYPES
1. Low Earth Orbit (LEO)
2. Geostationary Orbit (GEO)
3. Polar Orbit
4. Sun-Synchronous Orbit (SSO)
Low Earth Orbit (LEO): Satellites in LEO typically orbit at altitudes ranging from 160 to
2,000 kilometers above the Earth's surface. These orbits provide high spatial resolution
imagery and frequent revisits to specific locations due to their relatively short orbital periods.
Geostationary Orbit (GEO): Satellites in GEO orbit at an altitude of approximately 35,786
kilometers above the equator. They maintain a fixed position relative to the Earth's surface,
making them ideal for continuous monitoring of specific regions, such as weather patterns
and environmental changes.
Polar Orbit: Polar orbiting satellites pass over the Earth's poles, providing global coverage
with each orbit. These orbits are commonly used for environmental monitoring, as they allow
for comprehensive observations of land, oceans, and atmosphere over time.
Sun-Synchronous Orbit (SSO): Satellites in SSO maintain a constant angle relative to the Sun
as they orbit the Earth, ensuring consistent lighting conditions during each pass over the same
area. This orbit is particularly useful for monitoring changes in vegetation, land use, and
other surface features.
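The GEO altitude quoted above follows directly from Kepler's third law, T = 2π√(a³/μ). Solving for the semi-major axis given one sidereal day (standard constant values assumed):

```python
import math

MU = 3.986004418e14       # Earth's gravitational parameter GM, m^3/s^2
R_EQ = 6.378137e6         # Earth's equatorial radius, m

def altitude_for_period(period_s):
    """Semi-major axis from Kepler's third law, minus Earth's equatorial radius."""
    a = (MU * period_s**2 / (4 * math.pi**2)) ** (1 / 3)
    return a - R_EQ

# One sidereal day (~86164 s) reproduces the geostationary altitude:
print(round(altitude_for_period(86164) / 1000))  # ≈ 35786 km
```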
Platforms:
Remote sensing platforms encompass a variety of vehicles or devices used to carry sensors
into the Earth's atmosphere or space. Some common platforms include:
Satellites: Satellites are spacecraft placed into orbit around the Earth or other celestial bodies.
They house remote sensing instruments that capture data across different wavelengths of the
electromagnetic spectrum.
Unmanned Aerial Vehicles (UAVs): UAVs, or drones, are aircraft operated without a human
pilot onboard. They are equipped with sensors capable of capturing high-resolution imagery
and collecting data over targeted areas with flexibility and cost-effectiveness.
Aircraft: Manned aircraft equipped with remote sensing instruments are used for airborne
data collection at various altitudes. These platforms offer higher spatial resolution compared
to satellite-based sensors and can be deployed for specialized missions or rapid response
tasks.
Ground-Based Platforms: Ground-based sensors and observatories are stationed on the
Earth's surface or mounted on fixed structures. They provide continuous monitoring of
specific locations and contribute to validating data collected from airborne or satellite
platforms.
MOTIONS OF PLANETS AND SATELLITES
The motions of planets and satellites play a crucial role in remote sensing applications,
influencing the positioning, coverage, and data acquisition capabilities of remote sensing
instruments. Understanding these motions is essential for optimizing the design and operation
of remote sensing missions.
Satellite Orbits:
Newton's law of gravitation governs the motion of satellites in orbit around celestial bodies,
such as the Earth. The gravitational force between the satellite and the Earth determines the
shape, size, and stability of the satellite's orbit. Remote sensing satellites rely on specific
orbits, such as polar orbits or geostationary orbits, to achieve desired coverage and revisit
times.
Trajectory Planning:
Understanding the gravitational interactions between the satellite and other celestial bodies
(e.g., the Moon, the Sun) is crucial for trajectory planning in remote sensing missions. By
accounting for gravitational forces, mission planners can optimize satellite paths, minimize
fuel consumption, and ensure accurate positioning for data acquisition.
Orbital Dynamics:
Newton's law of gravitation influences various orbital parameters, including eccentricity,
inclination, and argument of periapsis. These parameters dictate the orbital characteristics of remote
sensing satellites, such as their altitude, orbital period, and ground track. Precise control of
these parameters is essential for achieving desired observational objectives and optimizing
data collection strategies.
Gravitational Perturbations:
Gravitational perturbations from other celestial bodies can affect satellite orbits over time.
These perturbations may cause orbital precession, nodal regression, or secular drift, which
can impact the long-term stability and operational lifespan of remote sensing missions.
Understanding and mitigating these effects are vital for maintaining satellite performance and
data continuity.
GRAVITATIONAL FIELD AND POTENTIAL
The gravitational field and potential are fundamental aspects of Earth's geophysical
environment that influence remote sensing measurements and data interpretation. Understanding these
concepts is crucial for accurately analyzing remote sensing data and extracting meaningful
information about the Earth's surface and subsurface features.
Remote sensing technologies, such as satellite gravimetry and airborne gravity surveys, are
used to map gravity anomalies with high spatial resolution. These data aid in geological
mapping, mineral exploration, and understanding tectonic processes.
Subsurface Characterization:
Gravitational data, when integrated with other remote sensing datasets (e.g., multispectral
imagery, radar data), can provide valuable insights into subsurface characteristics, such as
lithology, density variations, and groundwater resources.
Gravity surveys, combined with geophysical inversion techniques, enable the estimation of
subsurface properties and the delineation of geological structures, facilitating resource
exploration and environmental assessment.
ESCAPE VELOCITY
Escape velocity is a concept in physics referring to the minimum velocity an object needs to
escape the gravitational pull of a massive body, such as a planet or a moon, without being
propelled further by additional force. In the context of remote sensing, escape velocity is not
directly relevant because remote sensing typically involves objects, such as satellites or
drones, that are intentionally placed into orbit around the Earth rather than being launched
into space or escaping Earth's gravitational field entirely.
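For reference, escape velocity follows from v_esc = √(2GM/r), where GM is the planet's gravitational parameter and r the distance from its centre. At the Earth's surface:

```python
import math

MU = 3.986004418e14  # Earth's gravitational parameter GM, m^3/s^2

def escape_velocity(r_m):
    """v_esc = sqrt(2 * GM / r) for distance r from Earth's centre, in m/s."""
    return math.sqrt(2 * MU / r_m)

print(round(escape_velocity(6.371e6)))  # ≈ 11186 m/s (~11.2 km/s) at the surface
```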
KEPLER’S LAW OF PLANETARY MOTION
Kepler's laws of planetary motion are a set of three fundamental principles describing the
motion of planets and other celestial bodies around the Sun. While these laws are primarily
concerned with the dynamics of celestial bodies in the solar system, they have implications
for remote sensing, particularly in the context of satellite orbits and orbital dynamics.
ELEMENTS
Semi-Major Axis (a):
The semi-major axis is half of the major axis of an ellipse representing the orbit. It defines the
average distance between the satellite and the center of the Earth.
Eccentricity (e):
Eccentricity measures the deviation of an orbit from a perfect circle. It ranges from 0 (circular
orbit) to 1 (highly elliptical orbit), determining the shape of the orbit.
Inclination (i):
Inclination is the angle between the orbital plane and the equatorial plane of the Earth. It
defines the orientation of the orbit relative to the Earth's rotation axis.
Right Ascension of the Ascending Node (RAAN):
RAAN is the angle measured from a reference direction (typically the vernal equinox) to the
point where the orbit crosses the equatorial plane from south to north.
Argument of Perigee (ω):
The argument of perigee is the angle measured from the ascending node to the point of
closest approach (perigee) to the Earth's surface.
True Anomaly (ν):
True anomaly is the angle measured from the perigee to the current position of the satellite,
defining its position along the orbit.
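Two of these elements determine the orbit's extremes directly: the perigee and apogee distances are r_p = a(1 − e) and r_a = a(1 + e). A sketch using Molniya-like values (a ≈ 26,600 km, e ≈ 0.74) for illustration:

```python
def apsides(a_km, e):
    """Perigee and apogee distances from Earth's centre: a*(1-e) and a*(1+e), in km."""
    return a_km * (1 - e), a_km * (1 + e)

# Molniya-like orbit: semi-major axis ~26,600 km, eccentricity ~0.74
rp, ra = apsides(26600, 0.74)
print(round(rp), round(ra))  # 6916 46284
```

Subtracting Earth's radius (~6,378 km) shows why such an orbit swings from a few hundred kilometers at perigee to nearly 40,000 km at apogee.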
TYPES
1. Low Earth Orbit (LEO)
2. Geostationary Orbit (GEO)
3. Polar Orbit
4. Sun-Synchronous Orbit (SSO)
5. Molniya Orbit
6. Highly Elliptical Orbit (HEO)
Low Earth Orbit (LEO):
Satellites in LEO typically orbit at altitudes ranging from 160 to 2,000 kilometers above the
Earth's surface. LEOs offer high spatial resolution imagery and frequent revisits to specific
locations due to their short orbital periods.
Geostationary Orbit (GEO):
Satellites in GEO orbit at an altitude of approximately 35,786 kilometers above the equator.
They remain stationary relative to the Earth's surface, providing continuous monitoring of
specific regions, such as weather patterns.
Polar Orbit:
Polar orbiting satellites pass over the Earth's poles, providing global coverage with each orbit.
They are commonly used for environmental monitoring and scientific research due to their
comprehensive observational capabilities.
Sun-Synchronous Orbit (SSO):
Satellites in SSO maintain a constant angle relative to the Sun as they orbit the Earth,
ensuring consistent lighting conditions during each pass over the same area. SSOs are
suitable for monitoring changes in vegetation, land use, and climate.
Molniya Orbit:
Molniya orbits are highly elliptical orbits with high inclinations, optimized for providing
extended coverage of high-latitude regions. They are commonly used in communication and
remote sensing satellites for observing polar regions.
Highly Elliptical Orbit (HEO):
HEOs have highly elliptical shapes with apogees far from the Earth and perigees relatively
close to the planet. They are utilized for specialized missions requiring long dwell times over
specific areas, such as communication or Earth observation.
ORBITAL PERTURBATIONS AND MANEUVERS
Orbital perturbations and maneuvers are essential considerations in remote sensing missions
to ensure the stability, accuracy, and efficiency of satellite orbits for data acquisition.
Perturbations are deviations from the ideal orbital path caused by gravitational, atmospheric,
and other factors. Maneuvers involve intentional adjustments to the satellite's orbit to
compensate for perturbations or achieve specific mission objectives.
Orbital Perturbations:
Gravitational Perturbations:
Gravitational forces from the Earth, Moon, and other celestial bodies cause variations in the
satellite's orbit, leading to perturbations. These perturbations can include changes in orbital
eccentricity, inclination, and nodal regression over time.
Atmospheric Drag:
Satellites in low Earth orbit (LEO) experience atmospheric drag, causing their orbits to decay
gradually. This drag results from interactions with the Earth's atmosphere, particularly at
lower altitudes, and requires periodic maneuvers to maintain the satellite's altitude and orbital
parameters.
Solar Radiation Pressure:
Solar radiation exerts pressure on the satellite's surface, causing small accelerations that
affect its orbit. Solar radiation pressure perturbations can cause deviations in the satellite's
position, leading to drift over time and requiring periodic corrections.
Geopotential Variations:
Variations in the Earth's gravitational field due to uneven mass distribution (e.g., mountains,
oceans, and density variations in the Earth's interior) induce perturbations in satellite orbits.
These variations can affect orbital elements such as inclination, eccentricity, and orbital
precession.
Orbital Maneuvers:
Orbit Raising or Lowering:
Satellites in LEO may perform orbit-raising maneuvers to counteract atmospheric drag and
maintain their altitude. Conversely, orbit-lowering maneuvers can be conducted to deorbit the
satellite at the end of its operational life or to transition to a lower orbit for mission
requirements.
Plane Change Maneuvers:
Plane change maneuvers involve adjusting the satellite's inclination to align its orbital plane
with a desired ground track or to synchronize with other satellites in a constellation. These
maneuvers are useful for optimizing coverage and revisits over specific regions of interest.
Station-Keeping Maneuvers:
Station-keeping maneuvers are performed to maintain a satellite's position relative to a
specific location on the Earth's surface or to other satellites in a constellation. These
maneuvers ensure consistent coverage and facilitate continuous monitoring of target areas.
Collision Avoidance Maneuvers:
Satellites may perform collision avoidance maneuvers to mitigate the risk of collisions with
other space objects, such as debris or operational satellites. These maneuvers involve
adjusting the satellite's orbit to avoid potential collisions and ensure mission safety.
Orbital Resonance Adjustment:
Satellites in certain orbits, such as those in resonance with the Earth's rotation or other
celestial bodies, may require periodic adjustments to maintain resonance conditions or
prevent destabilizing effects.
Ground-Based Platforms:
Ground-based remote sensing platforms are stationary or mobile platforms located on the
Earth's surface. They include:
Fixed Observatories: These are permanent installations equipped with various sensors and
instruments for continuous monitoring of specific locations. Examples include weather
stations, flux towers, and seismic stations.
Mobile Platforms: Mobile platforms such as vehicles, boats, or drones are equipped with
remote sensing instruments and can traverse different terrains to collect data over specific
areas of interest. Mobile platforms offer flexibility and versatility in data collection.
Terrestrial LiDAR: Terrestrial LiDAR systems are ground-based laser scanning devices used
to capture high-resolution 3D data of terrain, vegetation, buildings, and infrastructure. They
are often used for mapping, urban planning, and infrastructure management.
Ground-based platforms are advantageous for their relatively low cost, ease of deployment,
and ability to collect data at high spatial resolutions. However, their coverage is limited
compared to airborne and spaceborne platforms.
Airborne Platforms:
Airborne remote sensing platforms operate from aircraft, helicopters, or unmanned aerial
vehicles (UAVs) and provide an intermediate level of altitude between ground-based and
spaceborne platforms. They include:
Manned Aircraft: Manned aircraft equipped with remote sensing instruments fly at various
altitudes to capture data over large areas. They are used for aerial photography, multispectral
imaging, and LiDAR mapping.
Unmanned Aerial Vehicles (UAVs): UAVs, or drones, are increasingly utilized for remote
sensing applications due to their ability to collect high-resolution data at low altitudes with
flexibility and cost-effectiveness.
Spaceborne Platforms:
Spaceborne remote sensing platforms operate from satellites orbiting the Earth and provide a
global perspective of the planet's surface and atmosphere. They include:
Earth Observation Satellites: Earth observation satellites are equipped with a variety of
sensors, including optical, thermal, radar, and multispectral instruments. They orbit the Earth
at different altitudes and inclinations to capture data for various applications, including
environmental monitoring, weather forecasting, land use mapping, and disaster management.
Spaceborne LiDAR: Spaceborne LiDAR systems mounted on satellites are used to measure
the elevation of the Earth's surface with high precision. They provide valuable data for
mapping terrain, monitoring glaciers, forests, and urban areas, and assessing topographic
changes.
Spaceborne platforms offer global coverage, long-term monitoring capabilities, and access to
remote or inaccessible regions. However, they require significant investment in launch and
satellite development and have limitations in spatial resolution compared to airborne
platforms.
CLASSIFICATION OF SATELLITES
Satellites can be classified based on various criteria, including their orbits, missions, and
applications. Two common types are:
1. Sun-synchronous satellites
2. Geostationary satellites
Sun-Synchronous Satellites:
Sun-synchronous satellites, also known as polar-orbiting satellites, orbit the Earth in a
near-polar orbit while maintaining a consistent angle relative to the Sun. This characteristic
ensures that the satellite passes over any given point on the Earth's surface at roughly the
same local solar time on each orbit. Sun-synchronous satellites typically have the following
characteristics:
Orbit: Sun-synchronous satellites typically orbit the Earth in a near-polar, low Earth orbit
(LEO) at altitudes ranging from a few hundred to a few thousand kilometers. These orbits are
inclined at an angle relative to the equator, allowing the satellite to cover different latitudes
with each orbit while maintaining a consistent solar angle.
Advantages: Sun-synchronous satellites offer several advantages for remote sensing and
Earth observation missions:
Consistent Lighting Conditions: By maintaining a consistent angle relative to the Sun, Sun-
synchronous satellites ensure uniform lighting conditions during each pass over the Earth's
surface. This consistency is critical for applications such as vegetation monitoring, land cover
mapping, and change detection.
Seasonal Coverage: Sun-synchronous orbits allow satellites to cover the entire globe over the
course of several days or weeks, providing comprehensive seasonal coverage of the Earth's
surface.
Repeat Pass Capability: Sun-synchronous satellites have a repeatable ground track, enabling
them to revisit the same locations on the Earth's surface at regular intervals. This capability is
valuable for monitoring changes over time and detecting trends in environmental phenomena.
Applications: Sun-synchronous satellites are used for a wide range of applications, including
environmental monitoring, climate studies, land use mapping, agriculture, forestry, disaster
management, and scientific research.
Geostationary Satellites:
Geostationary satellites, also known as geosynchronous equatorial orbit (GEO) satellites,
orbit the Earth directly above the equator at the same rate as the Earth's rotation, so they
appear stationary over a fixed point on the Earth's surface. Geostationary satellites typically
have the following characteristics:
Orbit: Geostationary satellites orbit the Earth at an altitude of approximately 35,786
kilometers above the equator. They orbit in the same direction as the Earth's rotation,
completing one orbit every sidereal day (about 23 hours 56 minutes), so that they remain
fixed over the same longitude.
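The 35,786 km altitude quoted above follows directly from Kepler's third law. A short sketch, using standard values for Earth's gravitational parameter and equatorial radius (these constants are not given in the notes):

```python
import math

GM = 3.986004418e14   # Earth's gravitational parameter, m^3 s^-2 (standard value)
R_EQ = 6.378137e6     # Earth's equatorial radius, m
T_SID = 86164.1       # one sidereal day, s (about 23 h 56 min 4 s)

def geostationary_altitude_km():
    """Kepler's third law: T^2 = 4*pi^2*a^3 / GM. Solve for the semi-major
    axis a, then subtract Earth's radius to obtain the orbital altitude."""
    a = (GM * T_SID ** 2 / (4 * math.pi ** 2)) ** (1.0 / 3.0)
    return (a - R_EQ) / 1000.0

print(round(geostationary_altitude_km()))  # ~35786 km, matching the text
```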
Advantages: Geostationary satellites offer several advantages for communications, weather
monitoring, and other applications:
LAGRANGE ORBIT
The Lagrange points, also known as libration points or Lagrangian points, are positions in
space where the gravitational forces of two large bodies, such as the Earth and the Moon or
the Earth and the Sun, combine to supply exactly the centripetal force needed for a smaller
object to orbit with them. There are five Lagrange points, labeled L1 through L5. While
Lagrange points are not typically used for
remote sensing satellites, they can be advantageous for certain specialized missions due to
their unique orbital characteristics. Let's explore how Lagrange points could potentially be
utilized for remote sensing:
Lagrange Point 1 (L1):
L1 is located between the Earth and the Sun, directly along the line connecting their centers.
At this point, the gravitational forces of the Earth and the Sun balance out, allowing a satellite
to maintain a relatively stable position with respect to both bodies.
Advantages for Remote Sensing: A satellite positioned at L1 could provide continuous solar
observation, monitoring space weather phenomena, and providing early warning of solar
storms, which can impact satellite communications and power grids on Earth.
Based on Platform:
Satellite Sensors: Mounted on satellites orbiting the Earth, these sensors provide global
coverage and are used for various applications such as environmental monitoring, weather
forecasting, and disaster management.
Aerial Sensors: Mounted on aircraft or drones, these sensors provide high-resolution
imagery and are suitable for localized and rapid data collection over specific areas.
Ground-Based Sensors: Fixed or mobile sensors deployed on the ground, which are used for
specific applications such as weather monitoring, traffic monitoring, and environmental
research.
Based on Application:
Environmental Monitoring: Sensors used for assessing and monitoring environmental
parameters such as land cover, vegetation health, water quality, and air pollution.
Weather and Climate Monitoring: Sensors used for measuring meteorological parameters
such as temperature, humidity, precipitation, and atmospheric composition.
Defense and Security: Sensors used for surveillance, reconnaissance, and intelligence
gathering in defense and security applications.
Agriculture and Forestry: Sensors used for monitoring crop health, estimating yields,
assessing forest resources, and detecting forest fires.
RESOLUTION CONCEPT
The concept of resolution in remote sensing refers to the ability of a sensor to
distinguish between objects or features on the Earth's surface or in its atmosphere. It is a
critical aspect that determines the level of detail present in the imagery or data collected by
the sensor. Resolution can be classified into four types:
1. Spatial Resolution
2. Spectral Resolution
3. Temporal Resolution
4. Radiometric Resolution
Spatial Resolution:
Spatial resolution refers to the size of the smallest discernible or resolvable feature in
the imagery. For optical sensors, it is typically measured in terms of meters per pixel
or centimeters per pixel on the ground.
Higher spatial resolution means smaller pixel sizes and greater detail in the imagery,
allowing for the identification of smaller objects or features.
Spatial resolution is influenced by factors such as the sensor's spatial sampling capabilities,
altitude, and optics.
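As an illustrative sketch of how the factors above combine, the ground sample distance (GSD) of a simple nadir-viewing optical sensor can be estimated from its altitude, detector pixel pitch, and focal length. The numbers below are hypothetical, not from any particular sensor:

```python
def ground_sample_distance(altitude_m, pixel_pitch_m, focal_length_m):
    """GSD = altitude * pixel pitch / focal length
    (simple nadir-viewing, flat-Earth approximation)."""
    return altitude_m * pixel_pitch_m / focal_length_m

# e.g. a 700 km orbit, 10-micrometre detector pixels, 1 m focal length
# gives a ground pixel of roughly 7 m:
gsd = ground_sample_distance(700e3, 10e-6, 1.0)
```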
Spectral Resolution:
Spectral resolution refers to the ability of a sensor to distinguish between different
wavelengths or bands of electromagnetic radiation.
It is determined by the number and width of the spectral bands captured by the sensor.
Sensors with higher spectral resolution can discriminate between a greater number of
spectral features, enabling more detailed analysis of surface properties such as
vegetation health, mineral composition, and land cover types.
Temporal Resolution:
Temporal resolution refers to the frequency at which a sensor revisits or acquires data
over a particular area.
It is measured in terms of the time interval between successive observations.
Sensors with higher temporal resolution provide more frequent updates of the Earth's
surface, allowing for monitoring of dynamic processes such as land cover changes,
crop growth cycles, and natural disasters.
Radiometric Resolution:
Radiometric resolution refers to the sensor's ability to detect and record variations in
the intensity or brightness of electromagnetic radiation.
It is determined by the number of bits used to represent the digital values of the
recorded data.
Higher radiometric resolution enables the sensor to capture subtle differences in
reflectance or emission, leading to greater sensitivity and accuracy in quantitative
analysis.
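The link between bit depth and radiometric resolution can be made concrete: an n-bit sensor records 2^n distinguishable brightness levels.

```python
def gray_levels(bits):
    """Number of distinguishable brightness levels for an n-bit sensor."""
    return 2 ** bits

print(gray_levels(8))   # 256 levels (typical 8-bit imagery)
print(gray_levels(11))  # 2048 levels (higher radiometric resolution)
```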
SCANNERS
In remote sensing, scanners can be classified based on the direction in which they acquire
image data relative to the platform's movement. Two common types of scanners are
1. Along-track scanners
2. Across-track scanners
Along-Track Scanners:
Along-track scanners, also known as "pushbroom" scanners, acquire image data in the
direction of the platform's movement.
These scanners use linear or two-dimensional arrays of detectors to capture a
continuous swath of data perpendicular to the platform's flight path.
As the platform moves forward, the detectors collect data continuously along the
track, producing an image composed of adjacent scan lines.
Examples of platforms equipped with along-track scanners include most satellite
sensors, where the satellite moves along its orbital path while scanning the Earth's
surface below.
Across-Track Scanners:
Across-track scanners, also known as "whiskbroom" scanners, acquire image data
across the platform's track, perpendicular to the direction of movement.
These scanners typically use one or more detectors that scan across the swath
of interest as the platform moves forward.
The detectors collect data along individual scan lines across the swath, and the
platform may need to make multiple passes to cover the entire area of interest.
Examples of platforms equipped with across-track scanners include some airborne
sensors and ground-based systems.
Key Differences:
Spatial Coverage:
Along-track scanners cover a continuous swath perpendicular to the platform's path,
providing a wider spatial coverage in a single pass.
Across-track scanners cover a swath across the platform's path, requiring multiple
passes or scans to achieve the same spatial coverage as along-track scanners.
Image Formation:
Along-track scanners produce images composed of adjacent scan lines collected
continuously along the platform's track.
Across-track scanners produce images composed of scan lines collected across the
swath width, typically with gaps between adjacent lines that may need to be stitched
together.
Applications:
Along-track scanners are well-suited for satellite-based remote sensing applications,
where wide-area coverage is essential.
Across-track scanners are commonly used in airborne remote sensing applications,
where high spatial resolution and detailed imaging of smaller areas are required.
OPTICAL SENSORS
Principle: Optical sensors operate in the visible, near-infrared (NIR), and short-wave
infrared (SWIR) regions of the electromagnetic spectrum. They detect the reflected sunlight
from the Earth's surface. Different materials reflect and absorb light differently, allowing
optical sensors to discern various features on the ground.
Applications: Optical sensors are widely used in land cover classification, vegetation
monitoring, urban planning, agriculture, and environmental studies.
Calibration: Calibration of optical sensors involves correcting for radiometric and geometric
distortions in the imagery. Radiometric calibration ensures that pixel values represent
accurate reflectance values. Geometric calibration corrects for distortions such as terrain
relief, sensor tilt, and Earth curvature.
INFRARED SENSORS
Principle: Infrared sensors operate in the infrared portion of the electromagnetic spectrum,
beyond the visible range. They detect thermal radiation emitted by objects. Infrared sensors
can be further divided into near-infrared (NIR), short-wave infrared (SWIR), mid-wave
infrared (MWIR), and thermal infrared (TIR) sensors, each sensitive to different wavelengths.
Applications: Infrared sensors are used for applications such as vegetation health
assessment, soil moisture estimation, mineral identification, and heat mapping.
Calibration: Calibration of infrared sensors involves correcting for sensor noise,
atmospheric effects, and temperature variations. Radiometric calibration ensures that pixel
values accurately represent thermal radiance or temperature values.
THERMAL SENSORS
Principle: Thermal sensors operate specifically in the thermal infrared (TIR) region of the
electromagnetic spectrum, detecting the thermal radiation emitted by objects. They measure
the temperature of objects or surfaces based on their thermal emissions.
Applications: Thermal sensors are used for applications such as monitoring land surface
temperature, detecting heat anomalies, assessing thermal properties of buildings, and
identifying thermal signatures of vegetation stress or fires.
Calibration: Calibration of thermal sensors involves ensuring accurate temperature
measurements by calibrating the sensor's response to known temperature references.
Corrections are made for sensor drift, non-uniformity, and atmospheric effects.
MICROWAVE SENSORS
Principle: Microwave sensors operate in the microwave portion of the electromagnetic
spectrum. They emit microwave pulses and measure the backscattered radiation reflected
from the Earth's surface. Microwave sensors can penetrate clouds, vegetation, and soil,
making them useful for all-weather and day-night imaging.
Applications: Microwave sensors are used for applications such as terrain mapping, soil
moisture estimation, sea surface monitoring, ice detection, and agricultural monitoring.
Calibration: Calibration of microwave sensors involves correcting for system noise, antenna
patterns, and atmospheric effects. Radiometric calibration ensures accurate measurements of
backscattered microwave signals.
CALIBRATION OF SENSORS
Calibration of sensors in remote sensing is a critical process to ensure that the data collected
by the sensors are accurate, reliable, and consistent. Calibration involves a series of steps to
correct for various sources of error and uncertainty in the sensor measurements.
1. Radiometric Calibration
2. Geometric Calibration
3. Temporal Calibration
4. Cross-Track Calibration
5. In-Flight Calibration
Radiometric Calibration:
Purpose: Radiometric calibration ensures that the digital numbers (DN) or sensor readings
recorded by the sensor accurately represent the radiance or reflectance of the objects being
observed.
Steps:
Response Calibration: The sensor's response to known radiance or reflectance
standards is measured and used to establish a calibration curve relating sensor
readings to physical units (e.g., watts per square meter per steradian).
Correction for Systematic Errors: Corrections are applied to compensate for sensor-
specific errors such as dark current, sensor gain variations, non-linearity, and stray
light.
Atmospheric Correction: Corrections are made to account for atmospheric effects
such as scattering, absorption, and path radiance, which can affect the observed
radiance values.
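A minimal sketch of the response-calibration and atmospheric-correction steps above: digital numbers are first converted to at-sensor radiance with a linear calibration curve, and then to top-of-atmosphere reflectance as a first step toward full atmospheric correction. The gain, offset, and solar-irradiance values below are hypothetical placeholders, not coefficients for any real sensor:

```python
import math

def dn_to_radiance(dn, gain, offset):
    """Response calibration: map a digital number (DN) to at-sensor
    radiance via a linear calibration curve, L = gain * DN + offset."""
    return gain * dn + offset

def toa_reflectance(radiance, esun, sun_elev_deg, d_au=1.0):
    """Standard conversion of radiance to top-of-atmosphere reflectance:
    rho = pi * L * d^2 / (ESUN * cos(solar zenith angle)),
    where d is the Earth-Sun distance in astronomical units."""
    zenith = math.radians(90.0 - sun_elev_deg)
    return math.pi * radiance * d_au ** 2 / (esun * math.cos(zenith))

# Hypothetical calibration coefficients, for illustration only:
L = dn_to_radiance(120, gain=0.5, offset=-2.0)            # 58.0 (radiance units)
rho = toa_reflectance(L, esun=1550.0, sun_elev_deg=60.0)  # a unitless reflectance
```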
Geometric Calibration:
Purpose: Geometric calibration ensures that the spatial relationships between objects in the
imagery are accurately represented, correcting for distortions introduced by the sensor and
platform.
Steps:
Sensor Model Calibration: Mathematical models are used to characterize the sensor's
geometric properties, including its focal length, lens distortion, and sensor orientation.
Ground Control Points (GCPs): GCPs with known coordinates on the Earth's surface
are identified in the imagery and used to estimate and correct geometric distortions
such as scale, rotation, and translation.
Orthorectification: Orthorectification is performed to project the image pixels onto a
map coordinate system, correcting for terrain relief and platform tilt effects.
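The GCP step above can be illustrated with a toy example: fitting a six-parameter affine transform that maps pixel (column, row) coordinates to map (x, y) coordinates. Real workflows use many GCPs and a least-squares fit; for clarity, this sketch solves the transform exactly from three non-collinear hypothetical GCPs:

```python
def affine_from_gcps(pixels, grounds):
    """Solve x = a*col + b*row + c and y = d*col + e*row + f exactly
    from three non-collinear GCPs, using Cramer's rule."""
    (c1, r1), (c2, r2), (c3, r3) = pixels
    det = (c2 - c1) * (r3 - r1) - (c3 - c1) * (r2 - r1)

    def solve(v1, v2, v3):
        # Solve for the two linear coefficients, then back-substitute
        # the constant offset using the first GCP.
        a = ((v2 - v1) * (r3 - r1) - (v3 - v1) * (r2 - r1)) / det
        b = ((c2 - c1) * (v3 - v1) - (c3 - c1) * (v2 - v1)) / det
        return a, b, v1 - a * c1 - b * r1

    (x1, y1), (x2, y2), (x3, y3) = grounds
    return solve(x1, x2, x3), solve(y1, y2, y3)

# Three hypothetical GCPs for a 30 m pixel grid:
(ax, bx, cx), (ay, by, cy) = affine_from_gcps(
    [(0, 0), (100, 0), (0, 100)],
    [(500000.0, 4000000.0), (503000.0, 4000000.0), (500000.0, 3997000.0)])
x = ax * 50 + bx * 50 + cx   # pixel (50, 50) maps to easting 501500.0
y = ay * 50 + by * 50 + cy   # and northing 3998500.0
```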
Temporal Calibration:
Purpose: Temporal calibration ensures temporal consistency and continuity in the sensor
data over time, allowing for meaningful comparisons and analysis of multi-temporal datasets.
Steps:
Inter-Sensor Calibration: If data are collected from multiple sensors or platforms,
calibration procedures are performed to ensure consistency and compatibility between
datasets.
Radiometric Normalization: Datasets acquired at different times may exhibit
variations in radiometric properties due to changes in atmospheric conditions, solar
angle, and sensor characteristics. Radiometric normalization techniques are applied to
standardize the data to a common radiometric scale.
Cross-Track Calibration:
Purpose: Cross-track calibration ensures uniformity and consistency in image quality across
the entire swath width of the sensor.
Steps:
Detector Response Calibration: Detector response variations across the sensor's field
of view are measured and corrected to ensure uniform sensitivity and accuracy.
Stray Light Correction: Stray light from adjacent pixels or off-nadir angles can
contaminate the signal, leading to inaccuracies in the image. Corrections are applied
to minimize stray light effects.
In-Flight Calibration:
Purpose: In-flight calibration involves periodic measurements and adjustments made during
sensor operation to monitor and maintain sensor performance over time.
Steps:
Onboard Calibration Targets: Some sensors are equipped with onboard calibration
targets or instruments to monitor sensor stability and performance.
Regular Monitoring: Sensor parameters such as signal-to-noise ratio, dynamic range,
and stability are monitored and recorded during routine operations. Adjustments and
recalibrations are made as needed to ensure data quality.
HIGH RESOLUTION SENSORS
Applications: High-resolution sensors are used for detailed mapping, urban planning,
infrastructure monitoring, disaster assessment, and other applications requiring fine spatial
detail.
Calibration: Calibration of high-resolution sensors involves ensuring accuracy in spatial and
radiometric measurements. Geometric calibration corrects for distortions in the imagery,
while radiometric calibration ensures accurate representation of pixel values.
LIDAR:
India has utilized LIDAR technology for various applications, including topographic
mapping, forest canopy analysis, urban planning, and infrastructure monitoring.
The Indian Space Research Organisation (ISRO) has flown laser-altimetry and
terrain-mapping payloads on its missions, such as the Lunar Laser Ranging
Instrument (LLRI) onboard the Chandrayaan-1 lunar mission and the Terrain
Mapping Camera-2 (TMC-2) on Chandrayaan-2 for lunar surface topography mapping.
UAV (Unmanned Aerial Vehicle):
UAVs are increasingly being used for remote sensing applications in India,
particularly for high-resolution imaging, agricultural monitoring, disaster assessment,
and infrastructure inspection.
Indian institutions and organizations, including ISRO, the Indian Institute of Remote
Sensing (IIRS), and various research institutes and universities, have been involved in
the development and deployment of UAVs for remote sensing purposes.
UAV platforms equipped with multispectral or hyperspectral sensors are utilized for
crop monitoring, land cover mapping, forest health assessment, and environmental
monitoring.
Orbital Earth Observation Satellites:
LiDAR Data:
LiDAR sensors emit laser pulses and measure the time it takes for the
pulses to return, providing highly accurate elevation data. LiDAR is
essential for creating Digital Elevation Models (DEMs), assessing
terrain characteristics, and mapping landforms and vegetation
structure.
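The time-of-flight principle described above reduces to a one-line formula: the pulse travels out to the target and back, so range = c × t / 2.

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def lidar_range_m(round_trip_time_s):
    """Range from pulse time-of-flight: the pulse travels to the target
    and back, so the one-way distance is half the round-trip path."""
    return C * round_trip_time_s / 2.0

# A return arriving about 4.67 microseconds after emission is ~700 m away:
r = lidar_range_m(4.67e-6)
```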
Level-0:
Raw data as received from the satellite without any processing.
Level-1:
Data processed to correct for sensor artifacts, geometric distortions,
and radiometric calibrations, making it usable for further analysis.
Level-2:
Further processed data with atmospheric correction applied to remove
atmospheric effects, enhancing the accuracy of quantitative analysis.
Level-3:
Data that is georeferenced and often aggregated over time or space to
create global or regional datasets suitable for thematic mapping and
trend analysis.
Level-4:
Derived products generated by combining satellite data with other
datasets or models to produce value-added products such as
vegetation indices, land cover maps, and climate variables.
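As a concrete example of the vegetation indices mentioned among Level-4 products, the Normalized Difference Vegetation Index (NDVI) is derived band-by-band from red and near-infrared reflectance:

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).
    Healthy vegetation reflects strongly in NIR and absorbs red light,
    so dense canopies yield values approaching 1."""
    return (nir - red) / (nir + red)

print(round(ndvi(0.45, 0.05), 2))  # 0.8  (dense, healthy vegetation)
print(round(ndvi(0.20, 0.15), 2))  # 0.14 (sparse cover / bare soil)
```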
Open Source Satellite Data Products:
1. Sentinel Data
2. Landsat Data
3. MODIS Data
4. ESA Earth Observation Data
5. NASA Earth Observing System Data
Sentinel Data:
The European Space Agency's Sentinel satellites offer free and open
access to a wealth of optical, radar, and thermal data through the
Copernicus Open Access Hub. Sentinel data is widely used for
environmental monitoring, disaster management, and scientific
research.
MODIS Data:
The Moderate Resolution Imaging Spectroradiometer (MODIS)
aboard NASA's Terra and Aqua satellites provides global coverage
with moderate spatial resolution and daily revisits. MODIS data is
used for monitoring vegetation dynamics, fire activity, sea surface
temperature, and atmospheric conditions.
(i) Tone
Ground objects of different colours reflect the incident radiation
differently, depending upon the incident wavelength and the physical
and chemical constituents of the objects.
The imagery as recorded in remote sensing is in different shades
or tones. For example, ploughed and cultivated lands record
differently from fallow fields. Tone is expressed qualitatively as
light, medium and dark.
In SLAR imagery, for example, the shadows cast by non-return
of the microwaves appear darker than those parts where greater
reflection takes place; the latter appear lighter in tone.
Similarly, in thermal imagery objects at higher temperature are
recorded in lighter tones than objects at lower temperature,
which appear in medium to darker tones. Likewise, topsoil
appears darker in tone than soil containing quartz sand.
Coniferous trees appear in lighter tone compared with clumps of
broad-leaved trees.
Tone, therefore, refers to the colour or reflective brightness.
Tone, along with texture and shadow (as described below), helps
in interpretation and hence is a very important key.
(ii) Texture
Even springs and seepage of water from the base of clay give a
kind of 'turbulent' texture.
(iii) Association
The relation of a particular feature to its surroundings is an
important key to interpretation.
Sometimes a single feature by itself may not be distinctive
enough to permit its identification.
For example, sinkholes appear as dark spots on imagery
where the surface or immediate subsurface consists of limestone;
thus the appearance of sinkholes is always associated
with surface limestone formations.
Another example is that of kettle holes, which appear as depressions
on photos in terminal-moraine and glacial terrain.
A further example is that of dark-toned features associated
with the flood plain of a river, which can be interpreted as infilled
oxbow lakes.
(iv) Shape
Some ground features have typical shapes due to their structure or
topography. For example, airfields and football stadiums can easily be
interpreted because of their definite ground shapes and geometry,
whereas volcanic cones, sand dunes, river terraces, cliffs, and gullies
can be identified because of their characteristic shapes controlled by
geology and topography.
(v) Size
The size of an object on an image, whether relative or absolute,
also helps in its identification.
Sometimes measurements of height (as by using a parallax
bar) also give clues to the nature of the object.
For example, measurement of the height of different clumps of
trees gives an idea of the different species; similarly,
measurements of the dip and strike of rock formations help in
identifying sedimentary formations.
Similarly, measurements of road width help in discriminating
roads of different categories, i.e., national, state, local, etc.
Size, of course, depends upon the scale of the imagery.
(vi) Shadows
DIGITAL INTERPRETATION
Digital interpretation in remote sensing refers to the process of
analyzing and extracting meaningful information from digital imagery
acquired by satellite, aerial, or other remote sensing platforms. Unlike
traditional visual interpretation, which relies on human analysts to
interpret features in photographs or maps, digital interpretation
involves the use of computer-based techniques to automate or assist in
the analysis of remote sensing data. The steps carried out are:
1. Image Processing
2. Feature Extraction
3. Change Detection
4. Quantitative Analysis
5. Integration with GIS and Modeling
6. Validation and Accuracy Assessment
PREPROCESSING
Preprocessing functions involve those operations that are
normally required prior to the main data analysis and extraction
of information, and are generally grouped as radiometric or
geometric corrections.
Radiometric corrections include correcting the data for sensor
irregularities and unwanted sensor or atmospheric noise, and
converting the data so they accurately represent the reflected or
emitted radiation measured by the sensor.
Geometric corrections include correcting for geometric
distortions due to sensor-Earth geometry variations, and
conversion of the data to real world coordinates (e.g. latitude
and longitude) on the Earth's surface.
The objective of the second group of image processing functions,
grouped under the term image enhancement, is solely to
improve the appearance of the imagery to assist in visual
interpretation and analysis.
Examples of enhancement functions include contrast stretching
to increase the tonal distinction between various features in a
scene.
Image classification
Supervised Classification
A supervised classification algorithm requires a training sample
for each class, that is, a collection of data points known to have
come from the class of interest. The classification is thus based
on how "close" a point to be classified is to each training
sample.
We shall not attempt to define the word "close" other than to say
that both geometric and statistical distance measures are used in
practical pattern recognition algorithms.
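One concrete instance of the geometric-distance idea above is the minimum-distance-to-means classifier: each class is summarized by the mean of its training sample, and a pixel is assigned to the class with the nearest mean. A minimal sketch with hypothetical two-band training data:

```python
def class_means(training):
    """training: {class_name: [feature vectors]} -> {class_name: mean vector}"""
    return {name: [sum(col) / len(pts) for col in zip(*pts)]
            for name, pts in training.items()}

def classify(pixel, means):
    """Assign the pixel to the class whose mean is nearest (squared
    Euclidean distance)."""
    def sq_dist(p, m):
        return sum((a - b) ** 2 for a, b in zip(p, m))
    return min(means, key=lambda name: sq_dist(pixel, means[name]))

# Hypothetical two-band training samples (e.g. red and NIR digital numbers):
training = {"water":      [(10, 5), (12, 6), (9, 4)],
            "vegetation": [(30, 90), (35, 95), (28, 88)]}
means = class_means(training)
print(classify((11, 7), means))    # water
print(classify((33, 85), means))   # vegetation
```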
Figure: Image classification
The more training points there are, the more accurate the
classification will be.
(ii) Select training data sets which are representative of the classes of
interest, showing both typical average feature values and a typical
degree of variability. For each class, select several training areas on
the image instead of just one. Each training area should contain a
moderately large number of pixels. Pick training areas from
homogeneous-appearing regions, and choose areas that are widely
and spatially dispersed across the full image, so that for each class
the training areas are uniformly distributed over the image.
(iii) Check that the selected areas have unimodal distributions
(histograms). A bimodal histogram suggests that pixels from two
different classes may be included in the training sample.
(iv) Select training sets physically using a computer-based
classification system. The poorest method is to use the coordinates
of training points or training regions directly.
Unsupervised Classification
Image enhancement
Low sensitivity of the detectors, weak signals from the objects
present on the Earth's surface, similar reflectances of different
objects, and the environmental conditions at the time of recording
are the major causes of low image contrast.
Another problem that complicates the photographic display of a
digital image is that the human eye is poor at discriminating the
slight radiometric or spectral differences that may characterize
the features. The main aim of digital enhancement is to amplify
these slight differences for better clarity of the image scene.
This means digital enhancement increases the separability
(contrast) between the interested classes or features.
Digital image enhancement may be defined as mathematical
operations applied to digital remote sensing input data to improve
the visual appearance of an image for better interpretability or
subsequent digital analysis. Since image quality is a subjective
measure varying from person to person, there is no single best
enhancement technique.
Contrast Enhancement
The sensors mounted on board aircraft and satellites have to
be capable of detecting upwelling radiance levels ranging from
very low (e.g., over oceans) to very high (e.g., over deserts, snow,
or ice).
For any particular area that is being imaged it is unlikely that the
full dynamic range of the sensor will be used, and the
corresponding image is dull and lacking in contrast, or else
overly bright. In terms of the RGB model, the pixel values are
clustered in a narrow range of grey levels.
If this narrow range of grey levels could be altered so as to fit
the full range of grey levels, then the contrast between the dark
and light areas of the image would be improved while
maintaining the relative distribution of the grey levels. In
practice this is a manipulation of the look-up table values.
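The look-up-table manipulation described above can be sketched as a linear contrast stretch that maps the occupied range [lo, hi] of DN values onto the full 0-255 display range:

```python
def linear_stretch(dn, lo, hi):
    """Map a DN clustered in [lo, hi] onto the full 8-bit display range.
    Values outside the range are clipped before scaling."""
    dn = max(lo, min(hi, dn))
    return round((dn - lo) / (hi - lo) * 255)

# Pixel values clustered between 60 and 120 get spread over 0-255:
print([linear_stretch(v, 60, 120) for v in (60, 90, 120)])  # [0, 128, 255]
```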
Spatial filters that pass fine detail and edges are called
high-frequency (high-pass) filters, and filters that pass low
frequencies are called low-frequency (low-pass) filters.
Filtering is performed by using convolution windows, also called
masks, templates, filters or kernels. In the process of filtering,
the window is moved over the input image starting from the
extreme top left-hand corner of the scene. A discrete
mathematical function transforms the original input image
digital numbers into new digital values. The window first moves
along a line; as soon as a line is complete, it restarts on the next
line until the entire image is covered.
The mask window may be rectangular (1 x 3 or 1 x 5 pixels) or
square (3 x 3, 5 x 5 or 7 x 7 pixels) in size. Each pixel of the
window is given a weight. For low-pass filters all the weights in
the window are positive, while for high-pass filters the
surrounding values may be negative or zero and the central pixel
is given a larger positive weight. In the case of a high-pass filter
the algebraic sum of all the weights in the window is zero.
Many types of mask windows can be designed by changing the
size and varying the weights within the window. The simplest
mathematical function performed in a filtering operation is
neighbourhood averaging. Another commonly used discrete
function is to calculate the sum of the products of the mask
elements and the corresponding input image digital numbers,
assigning the result to the central pixel of the moving window.
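The neighbourhood-averaging operation described above can be sketched as a 3 x 3 mean filter moved across the image line by line (for simplicity, border pixels are left unchanged):

```python
def low_pass_3x3(image):
    """Apply a 3x3 neighbourhood-averaging (low-pass) filter to a 2D
    list of pixel values; border pixels are copied through unchanged."""
    rows, cols = len(image), len(image[0])
    out = [row[:] for row in image]
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            window = [image[r + dr][c + dc]
                      for dr in (-1, 0, 1) for dc in (-1, 0, 1)]
            out[r][c] = sum(window) / 9.0  # all nine weights positive: 1/9 each
    return out

img = [[10, 10, 10],
       [10, 100, 10],   # one noisy spike
       [10, 10, 10]]
print(low_pass_3x3(img)[1][1])  # 20.0 -> the spike is smoothed toward its neighbours
```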