
COURSE MATERIAL
REMOTE SENSING
UNIT I & II
for
III YEAR / VI SEMESTER
ACADEMIC YEAR 2020-21

BY
Dr.V.SENTHILKUMAR
ASSOCIATE PROFESSOR
DEPARTMENT OF ELECTRONICS AND COMMUNICATION


UNIT I REMOTE SENSING


DEFINITION AND PROCESS OF REMOTE SENSING
INTRODUCTION
1) Nowadays scientists, researchers, students, and even common people are showing great
interest in better understanding our environment. By environment we mean the geographic
space of their study area and the events that take place there. In other words, we have come
to realize that geographic space, along with the data describing it, is part of our everyday
world; almost every decision we take is influenced or dictated by some fact of geography.
2) Advancements in sophisticated space technology (which can provide large volumes of
spatial data), along with declining costs of computer hardware and software (which can
handle these data), have made Remote Sensing and G.I.S. applicable not only to complex
environmental / spatial situations but also affordable to an increasingly wider audience.

REMOTE SENSING AND ITS COMPONENTS:

Remote sensing is the science of acquiring information about the Earth's
surface without actually being in contact with it. This is done by sensing and recording
reflected or emitted energy and processing, analyzing, and applying that information. In
much of remote sensing, the process involves an interaction between incident radiation
and the targets of interest. This is exemplified by the use of imaging systems, where the
following seven elements are involved. Note, however, that remote sensing also involves the
sensing of emitted energy and the use of non-imaging sensors.


Fig 1.1- Components of Remote Sensing


Energy Source or Illumination (A) – the first requirement for remote sensing is to have an
energy source which illuminates or provides electromagnetic energy to the target of interest.
Radiation and the Atmosphere (B) – as the energy travels from its source to the target, it
will come in contact with and interact with the atmosphere it passes through. This interaction
may take place a second time as the energy travels from the target to the sensor.
Interaction with the Target (C) - once the energy makes its way to the target through the
atmosphere, it interacts with the target depending on the properties of both the target and the
radiation.
Recording of Energy by the Sensor (D) - after the energy has been scattered by, or
emitted from, the target, we require a sensor (remote - not in contact with the target) to collect
and record the electromagnetic radiation.
Transmission, Reception, and Processing (E) - the energy recorded by the sensor has to be
transmitted, often in electronic form, to a receiving and processing station where the data are
processed into an image (hardcopy and/or digital).
Interpretation and Analysis (F) - the processed image is interpreted, visually and/or digitally
or electronically, to extract information about the target which was illuminated.
Application (G) - the final element of the remote sensing process is achieved when we
apply the information we have been able to extract from the imagery about the target in order
to better understand it, reveal some new information, or assist in solving a particular problem.


HISTORY OF REMOTE SENSING:


1839 - first photograph
1858 - first photo from a balloon
1903 - first plane
1909 - first photo from a plane
1903-4 - B/W infrared film
WW I and WW II
1960 - space

Passive/ Active Remote Sensing


Depending on the source of electromagnetic energy, remote sensing can be classified as
passive or active remote sensing.
In the case of passive remote sensing, the source of energy is one that is naturally available,
such as the Sun. Most remote sensing systems work in passive mode using solar energy as the
source of EMR. Solar energy reflected by the targets at specific wavelength bands is
recorded using sensors on board air-borne or space-borne platforms. In order to ensure ample
signal strength received at the sensor, wavelength / energy bands capable of traversing
through the atmosphere, without significant loss through atmospheric interactions, are
generally used in remote sensing. Any object which is at a temperature above 0 K (kelvin)
emits some radiation, which is approximately proportional to the fourth power of the
temperature of the object. Thus the Earth also emits some radiation, since its ambient
temperature is about 300 K. Passive sensors can also be used to measure the Earth's radiance,
but they are not very popular as the energy content is very low.
In the case of active remote sensing, energy is generated and sent from the remote sensing
platform towards the targets. The energy reflected back from the targets is recorded using
sensors on board the remote sensing platform. Most microwave remote sensing is done
through active remote sensing.
As a simple analogy, passive remote sensing is similar to taking a picture with an ordinary
camera, whereas active remote sensing is analogous to taking a picture with a camera having
a built-in flash.
What is Sensor Platform?
Platform is a stage where sensor or camera is mounted to acquire information about a target
under investigation.
According to Lillesand and Kiefer (2000), a platform is a vehicle, from which a sensor can be
operated.
For remote sensing applications, sensors should be mounted on suitable, stable platforms.
As the platform height increases, the observed area increases while the spatial resolution becomes coarser.


The types or characteristics of platform depend on the type of sensor to be attached and its
application.
Type of Platforms:
Platforms can vary from stepladders to satellites.
There are different types of platforms, classified based on their altitude above the earth's surface.
Three types of platforms are used to mount the remote sensors
1. Ground based Platform
2. Air - borne Platform, and
3. Space-borne Platform
Ground based Platforms:
 Ground based platforms are used to record detailed information about the objects or
features of the earth’s surface
 These are developed for the scientific understanding on the signal-object and signal-
sensor interactions.
 It includes both the laboratory and field study, used for both in designing sensors and
identification and characterization of land features.
 Example: Handheld platform, cherry picker, towers, portable masts and vehicles etc.

 Portable handheld photographic cameras and spectroradiometers are largely used in
laboratory and field experiments as reference data and for ground truth verification.
 Cranes and ground based cherry-picker platforms extend up to approximately 15 m.
Air- borne/ based Platforms:
 Airborne platforms were the sole non-ground-based platforms for early remote
sensing work.
 Aircraft remote sensing system may also be referred to as sub-orbital or airborne, or
aerial remote sensing system
 At present, airplanes are the most common airborne platform.
 Other observation platforms include balloons, drones (short sky spy) and high altitude
sounding rockets. Helicopters are occasionally used.
Balloons:
 Balloons are used for remote sensing observation (aerial photography) and nature
conservation studies.
 The first aerial images were acquired with a camera carried aloft by a balloon in 1858.
 Balloon floats at a constant height of about 30 km.


 Balloons as platforms are not very expensive like aircrafts. They have a great variety
of shapes, sizes and performance capabilities.
 The balloons have low acceleration, require no power and exhibit low vibrations.
 It consists of a rigid circular base plate for supporting the entire sensor system which
is protected by an insulating and shock proof light casing.
 The payload used for the Indian balloon experiment consisted of three Hasselblad cameras
with different film-filter combinations, providing panchromatic, infrared black-and-white
and infrared false colour images.
 Since the flight altitude is high compared to the normal aircraft heights used for aerial
survey, balloon imagery gives larger synoptic views.
 The balloon is governed by the wind at the floating altitude.
 There are three main types of balloon systems, viz. free balloons, Tethered balloons
and Powered Balloons.
 Free balloons can reach almost top of the atmosphere; hence, they can provide a
platform at intermediate altitude between those of aircraft and spacecraft (shown in
fig.)
 They have an altitude range of 22-40 km and can be used to a limited extent as platforms.
Drone:
 Drone is a miniature remotely piloted aircraft.
 It is designed to fulfill requirements for a low cost platform, with long endurance,
moderate payload capacity and capability to operate without a runway or small
runway.
 Drone includes equipment of photography, infrared detection, radar observation and
TV surveillance. It uses satellite communication link.
 An onboard computer controls the payload and stores data from different sensors and
instruments.
Aircraft Platform:
 Aircraft are used to collect very detailed images.
 Helicopters can be used for pinpoint locations, but they vibrate and lack stability.
 Special aircraft with cameras and sensors on vibration-free mounts are traditionally
used to acquire aerial photographs and images of land surface features.
 While low altitude aerial photography results in large scale images providing detailed
information on the terrain, high altitude smaller scale images offer the advantage of
covering a larger study area at lower spatial resolution.
 Aircraft platforms offer an economical method of remote sensing data collection for


small to large study areas with cameras, electronic imagers, across- track and along-
track scanners, and radar and microwave scanners.
 Low Altitude Aircraft: It is most widely used and generally operates below 30,000 ft.
 It is suitable for obtaining large scale image data of small areas.
 High altitude aircraft: It includes jet aircraft with good rate of climb, maximum speed,
and high operating ceiling. It acquires imagery for large areas
Rockets as Platforms:
 High altitude sounding rocket platforms are useful in assessing the reliability of
remote sensing techniques as regards their dependence on the distance from the target.
 Balloons have a maximum altitude of approximately 37 km, while satellites cannot
orbit below 120 km. High altitude sounding rockets can be used to a moderate altitude
above terrain
 Synoptic imagery can be obtained from rockets for areas of some 500,000 square km.
Space-borne/ based Platforms:
 In space- borne remote sensing, sensors are mounted on-board a spacecraft (space
shuttle or satellite) orbiting the earth.
 Space-borne or satellite platforms involve a high one-time cost but a relatively lower cost
per unit area of coverage, and can acquire imagery of the entire earth without requiring
permission.
 Space-borne imaging ranges from altitude 250 km to 36000 km.
 Space-borne remote sensing provides the following advantages:
 Large area coverage;
 Frequent and repetitive coverage of an area of interest;
 Quantitative measurement of ground features using radiometrically calibrated sensors;
 Semi-automated computerised processing and analysis;
 Relatively lower cost per unit area of coverage.
Spacecraft as Platform:
 Remote sensing is also conducted from the space shuttle or artificial satellites.
Artificial satellites are manmade objects, which revolve around another object.
 Satellite can cover much more land space than planes and can monitor areas on a
regular basis.
 Later on, with the LANDSAT and SPOT satellite programs, space photography received a
higher impetus.


ELECTROMAGNETIC SPECTRUM
The first requirement for remote sensing is to have an energy source to illuminate the target
(unless the sensed energy is being emitted by the target). This energy is in the form of
electromagnetic radiation. All electromagnetic radiation has fundamental properties and
behaves in predictable ways according to the basics of wave theory.

Electromagnetic radiation consists of an electrical field (E) which varies in magnitude in a
direction perpendicular to the direction in which the radiation is traveling, and a magnetic
field (M) oriented at right angles to the electrical field. Both these fields travel at the speed of
light (c). Two characteristics of electromagnetic radiation are particularly important for
understanding remote sensing: wavelength and frequency.

Electromagnetic radiation (EMR) travels through space as an electromagnetic wave at
the speed of light c, which is 3 × 10^8 metres per second.
Theoretical models of random media, including anisotropic effects, randomly distributed
discrete scatterers and rough surface effects, have been studied for remote sensing with
electromagnetic waves.

Wavelength

The wavelength is the length of one wave cycle, which can be measured as the distance
between successive wave crests. Wavelength is usually represented by the Greek letter
lambda (λ). Wavelength is measured in metres (m) or some factor of metres such as
nanometres (nm, 10^-9 metres), micrometres (μm, 10^-6 metres) or centimetres
(cm, 10^-2 metres). Frequency refers to the number of cycles of a wave passing a
fixed point per unit of time. Frequency is normally measured in hertz (Hz), equivalent to one
cycle per second, and various multiples of hertz.

Wavelength and frequency are related by the following formula:

c = λ × ν

where λ is the wavelength in metres, ν is the frequency in cycles per second (Hz), and c is the speed of light (3 × 10^8 m/s).



Therefore, the two are inversely related to each other. The shorter the wavelength, the higher
the frequency. The longer the wavelength, the lower the frequency. Understanding the
characteristics of electromagnetic radiation in terms of their wavelength and frequency is
crucial to understanding the information to be extracted from remote sensing data.
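As a quick numerical illustration of this inverse relation, here is a minimal sketch (Python is assumed; the snippet is illustrative and not part of the original notes):

```python
# Minimal sketch of the relation c = wavelength * frequency,
# using c = 3e8 m/s as quoted in the text.
C = 3.0e8  # speed of light, m/s

def wavelength_to_frequency(wavelength_m):
    """Return the frequency in Hz for a wavelength given in metres."""
    return C / wavelength_m

# Example: green light at 0.55 micrometres -> ~5.5e14 Hz, while a
# 3 cm microwave -> ~1e10 Hz (shorter wavelength, higher frequency).
print(wavelength_to_frequency(0.55e-6))
print(wavelength_to_frequency(0.03))
```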
The electromagnetic spectrum ranges from the shorter wavelengths (including gamma and x-
rays) to the longer wavelengths (including microwaves and broadcast radio waves). There are
several regions of the electromagnetic spectrum which are useful for remote sensing.


Fig 3 – Electromagnetic Spectrum


WAVELENGTH REGIONS IMPORTANT TO REMOTE SENSING:

Ultraviolet or UV

For most purposes, the ultraviolet (UV) portion of the spectrum has the shortest wavelengths
practical for remote sensing. This radiation lies just beyond the violet portion of the visible
wavelengths, hence its name. Some earth surface materials, primarily rocks and minerals,
emit visible radiation when illuminated by UV radiation.

Visible Spectrum
The light which our eyes - our "remote sensors" - can detect is part of the visible spectrum. It
is important to recognize how small the visible portion is relative to the rest of the spectrum.
There is a lot of radiation around us which is "invisible" to our eyes, but can be detected by
other remote sensing instruments and used to our advantage. The visible wavelengths cover a
range from approximately 0.4 to 0.7 μm. The longest visible wavelength is red and the
shortest is violet. Common wavelengths of what we perceive as particular colours from the
visible portion of the spectrum are listed below. It is important to note that this is the only
portion of the spectrum we can associate with the concept of colours.


Violet: 0.4 -0.446 μm

Blue: 0.446 -0.500 μm

Green: 0.500 -0.578 μm

Yellow: 0.578 -0.592 μm

Orange: 0.592 -0.620 μm

Red: 0.620 -0.7 μm

Blue, green, and red are the primary colours or wavelengths of the visible spectrum. They
are defined as such because no single primary colour can be created from the other two, but
all other colours can be formed by combining blue, green, and red in various proportions.
Although we see sunlight as a uniform or homogeneous colour, it is actually composed of
various wavelengths of radiation in primarily the ultraviolet, visible and infrared portions of
the spectrum. The visible portion of this radiation can be shown in its component colours
when sunlight is passed through a prism, which bends the light in differing amounts
according to wavelength.

Infrared (IR)

The next portion of the spectrum of interest is the infrared (IR) region, which covers the
wavelength range from approximately 0.7 μm to 100 μm, more than 100 times as wide as the
visible portion. The infrared can be divided into three categories based on their radiation
properties: the reflected near-IR, the middle IR and the thermal IR.
The reflected near-IR covers wavelengths from approximately 0.7 μm to 1.3 μm and is
commonly used to expose black and white and colour-infrared sensitive film.
The middle-infrared region includes energy with a wavelength of 1.3 to 3.0 μm.
The thermal IR region is quite different from the visible and reflected IR portions, as this
energy is essentially the radiation that is emitted from the Earth's surface in the form of heat.
The thermal IR covers wavelengths from approximately 3.0 μm to 100 μm.


Microwave

The portion of the spectrum of more recent interest to remote sensing is the microwave
region, from about 1 mm to 1 m. This covers the longest wavelengths used for remote sensing.
Such a wavelength (or frequency) interval in the electromagnetic spectrum is commonly
referred to as a band, channel or region. The shorter microwave wavelengths have properties
similar to the thermal infrared region, while the longer wavelengths approach the
wavelengths used for radio broadcasts.


WAVE THEORY AND PARTICLE THEORY


Light can exhibit both wave and particle behaviour at the same time. Much of the
time, light behaves like a wave. Light waves are also called electromagnetic waves because
they are made up of both electric (E) and magnetic (H) fields. The fields oscillate
perpendicular to the direction of wave travel, and perpendicular to each other. Light waves
are known as transverse waves because they oscillate in the direction transverse to the
direction of wave travel.

Fig 1.4 – Electromagnetic propagation


Waves have two important characteristics - wavelength and frequency.
The sine wave is the fundamental waveform in nature. When dealing with light waves, we
refer to the sine wave. The period (T) of the waveform is one full 0 to 360 degree sweep. The
relationship between frequency and period is given by the equations:
f = 1 / T and T = 1 / f
The waveforms are always in the time domain and go on for infinity.
The speed of a wave can be found by multiplying the two units together. The wave's speed is
measured in units of length (distance) per second:
Wavelength x Frequency = Speed
As proposed by Einstein, light is composed of photons, very small packets of energy. The
reason that photons are able to travel at light speed is that they have no mass, and therefore
Einstein's famous equation E = mc² cannot be applied to them. Another formula, devised by
Planck, describes the relation between photon energy and frequency using Planck's
constant (h) = 6.63 × 10^-34 joule-seconds:

E = hf, or equivalently E = hc / λ

where E is the photon energy in joules, h is Planck's constant, f is the frequency in Hz, c is the speed of light and λ is the wavelength.
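A minimal sketch of the same computation, using the h and c values quoted above (illustrative only):

```python
# Sketch of Planck's relation E = h*f, equivalently E = h*c / wavelength,
# with h = 6.63e-34 J*s and c = 3e8 m/s as quoted in the text.
H = 6.63e-34  # Planck's constant, J*s
C = 3.0e8     # speed of light, m/s

def photon_energy(wavelength_m):
    """Photon energy in joules for a wavelength given in metres."""
    return H * C / wavelength_m

# A red photon (0.65 um) carries ~3.1e-19 J; a thermal-IR photon (10 um)
# carries ~2e-20 J, showing energy falling as wavelength increases.
print(photon_energy(0.65e-6))
print(photon_energy(10e-6))
```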


PARTICLE THEORY
The basic idea of quantum theory is that radiant energy is transmitted in indivisible packets
whose energy is given in integral multiples of size hν, where h is Planck's constant =
6.6252 × 10^-34 J·s, and ν is the frequency of the radiation. These packets are called quanta
or photons.

The dilemma of the simultaneous wave and particle natures of electromagnetic energy may be
conceptually resolved by considering that energy is not supplied continuously throughout a
wave, but rather that it is carried by photons. The classical wave theory does not give the
intensity of energy at a point in space, but gives the probability of finding a photon at that
point. Thus the classical concept of a wave yields to the idea that a wave simply describes the
probability path for the motion of the individual photons.
The particular importance of the quantum approach for remote sensing is that it provides the
concept of discrete energy levels in materials. The values and arrangement of these levels are
different for different materials. Information about a given material is thus available in
electromagnetic radiation as a consequence of transitions between these energy levels. A
transition to a higher energy level is caused by the absorption of energy, while a transition
from a higher to a lower energy level is caused by the emission of energy. The amounts of
energy either absorbed or emitted correspond precisely to the energy difference between the
two levels involved in the transition. Because the energy levels are different for each material,
the amount of energy a particular substance can absorb or emit is different for that material
from any other material. Consequently, the positions and intensities of the bands in the
spectrum of a given material are characteristic of that material.

STEFAN–BOLTZMANN LAW
Stefan–Boltzmann law, also known as Stefan's law, describes the power radiated from a
black body in terms of its temperature. Specifically, the Stefan–Boltzmann law states that the
total energy radiated per unit surface area of a black body across all wavelengths per
unit time (also known as the black-body radiant exitance or emissive power), M, is
directly proportional to the fourth power of the black body's thermodynamic temperature T:

M = σT⁴

where σ is the Stefan–Boltzmann constant.
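A short worked example of the T⁴ dependence; the value of σ below is the standard constant, which the notes do not quote explicitly:

```python
# Sketch of the Stefan-Boltzmann law M = sigma * T**4.
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W m^-2 K^-4 (standard value, assumed)

def radiant_exitance(temperature_k):
    """Total energy radiated per unit area of a black body, in W/m^2."""
    return SIGMA * temperature_k ** 4

# Earth (~300 K) vs Sun (~6000 K): a 20x temperature ratio gives a
# 20**4 = 160,000x ratio in emitted power per unit area.
print(radiant_exitance(300.0))   # ~459 W/m^2
print(radiant_exitance(6000.0))  # ~7.3e7 W/m^2
```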


WIEN'S DISPLACEMENT LAW
Wien's displacement law states that the black body radiation curve for different temperatures
peaks at a wavelength inversely proportional to the temperature. The shift of that peak is a
direct consequence of the Planck radiation law, which describes the spectral brightness of
black body radiation as a function of wavelength at any given temperature. However, it had
been discovered by Wilhelm Wien several years before Max


Planck developed that more general equation. The law describes the entire shift of the spectrum of
black body radiation toward shorter wavelengths as temperature increases.
Formally, Wien's displacement law states that the spectral radiance of black body radiation
per unit wavelength peaks at the wavelength λmax given by:

λmax = b / T

where T is the absolute temperature in kelvin and b is a constant of proportionality called
Wien's displacement constant, equal to 2.8977721(26)×10^-3 m·K, or, more conveniently to
obtain the wavelength in microns, b ≈ 2900 μm·K.
If one considers the peak of black body emission per unit frequency or per proportional
bandwidth, one must use a different proportionality constant. However, the form of the law
remains the same: the peak wavelength is inversely proportional to temperature (and the peak
frequency is directly proportional to temperature).
Wien's displacement law may be referred to as "Wien's law", a term which is also used for the
Wien approximation.
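A minimal sketch of λmax = b / T, using b ≈ 2898 μm·K, i.e. the displacement constant above expressed in microns (illustrative only):

```python
# Sketch of Wien's displacement law: peak wavelength = b / T.
B_UM_K = 2898.0  # Wien's displacement constant in um*K (2.8977721e-3 m*K)

def peak_wavelength_um(temperature_k):
    """Wavelength (micrometres) of peak black-body emission at T kelvin."""
    return B_UM_K / temperature_k

print(peak_wavelength_um(6000.0))  # Sun, ~0.48 um: peak in the visible
print(peak_wavelength_um(300.0))   # Earth, ~9.7 um: peak in the thermal IR
```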
Blackbody Radiation
A blackbody is a hypothetical, ideal radiator. It absorbs and reemits the entire energy incident
upon it.
Total energy emitted by a black body varies with temperature as given in Eq. 4. The total energy
is distributed over different wavelengths, which is called the spectral distribution or spectral
curve here. Area under the spectral curve gives the total radiant exitance M.
In addition to the total energy, the spectral distribution also varies with the temperature. Fig. 4
shows the spectral distribution of the energy radiated from black bodies at different
temperatures. The figure represents the Stefan-Boltzmann's law graphically. As the temperature
increases, the area under the curve, and hence the total radiant exitance, increases.


Figure 5. Spectral energy distribution of a blackbody at various temperatures

From Fig. 4, it can be observed that the peak of the radiant exitance varies with wavelength. As
the temperature increases, the peak shifts towards shorter wavelengths (to the left). This is
explained by Wien's displacement law. It states that the dominant wavelength at which a black
body radiates, λm, is inversely proportional to the absolute temperature of the black body (in K)
and is represented as given below:

λm = b / T ≈ 2898 / T μm (T in kelvin)


UNIT II EMR INTERACTION WITH ATMOSPHERE AND EARTH MATERIALS

ENERGY INTERACTIONS WITH THE ATMOSPHERE


Before radiation used for remote sensing reaches the Earth's surface it has to travel through
some distance of the Earth's atmosphere. Particles and gases in the atmosphere can affect the
incoming light and radiation. These effects are caused by the mechanisms of scattering and
absorption.

Fig 2.1 Energy Interaction with Atmosphere

SCATTERING
Scattering occurs when particles or large gas


molecules present in the atmosphere interact with and cause the electromagnetic radiation
to be redirected from its original path. How much scattering takes place depends on several
factors including the wavelength of the radiation, the abundance of particles or gases,
and the distance the radiation travels through the atmosphere. There are three (3) types of
scattering which take place.
RAYLEIGH SCATTERING
Rayleigh scattering occurs when particles are very small compared to the wavelength of the
radiation. These could be particles such as small specks of dust, or nitrogen and oxygen
molecules. Rayleigh scattering causes shorter wavelengths of energy to be scattered much
more than longer wavelengths. Rayleigh scattering is the dominant scattering mechanism in
the upper atmosphere. The fact that the sky appears "blue" during the day is because of this
phenomenon. As sunlight passes through the atmosphere, the shorter wavelengths (i.e. blue)
of the visible spectrum are scattered more than the other (longer) visible wavelengths. At
sunrise and sunset the light has to travel farther through the atmosphere than at midday and
the scattering of the shorter wavelengths is more complete; this leaves a greater proportion of
the longer wavelengths to penetrate the atmosphere.


Fig 2.2. Rayleigh Scattering


ABSORPTION

Absorption is the other main mechanism at work when electromagnetic radiation interacts
with the atmosphere. In contrast to scattering, this phenomenon causes molecules in the
atmosphere to absorb energy at various wavelengths. Ozone, carbon dioxide, and water vapor
are the three main atmospheric constituents which absorb radiation. Ozone serves to absorb
the harmful (to


most living things) ultraviolet radiation from the sun. Without this protective layer in the
atmosphere our skin would burn when exposed to sunlight. Carbon dioxide is referred to as a
greenhouse gas because it tends to absorb radiation strongly in the far infrared portion
of the spectrum - that area associated with thermal heating - which serves to trap this heat
longwave infrared and shortwave microwave radiation (between 22μm and 1m). The presence
of water vapour in the lower atmosphere varies greatly from location to location and at
different times of the year. For example, the air mass above a desert would have very little
water vapour to absorb energy, while the tropics would have high concentrations of water
vapour (i.e. high humidity).

MIE SCATTERING
Mie scattering occurs when the particles are just about the same size as the wavelength of the
radiation. Dust, pollen, smoke and water vapour are common causes of Mie scattering which
tends to affect longer wavelengths than those affected by Rayleigh scattering. Mie scattering
occurs mostly in the lower portions of the atmosphere where larger particles are more
abundant, and dominates when cloud conditions are overcast.
The final scattering mechanism of importance is called nonselective scattering. This occurs
when the particles are much larger than the wavelength of the radiation.

Water droplets and large dust particles can cause this type of scattering. Nonselective
scattering gets its name from the fact that all wavelengths are scattered about equally. This
type of scattering causes fog and clouds to appear white to our eyes because blue, green, and
red light are all scattered in approximately equal quantities (blue+green+red light = white
light).

ATMOSPHERIC WINDOWS
While EMR is transmitted from the sun to the surface of the earth, it passes through the
atmosphere. Here, electromagnetic radiation is scattered and absorbed by gases and dust
particles. Besides the major atmospheric gaseous components like molecular nitrogen and
oxygen, other constituents like water vapour, methane, hydrogen, helium and nitrogen
compounds play an important role in modifying electromagnetic radiation. This affects image
quality. Regions of the electromagnetic spectrum in which the atmosphere is transparent are
called atmospheric windows. In other words, certain spectral regions of the electromagnetic


radiation that pass through the atmosphere without much attenuation are called atmospheric
windows. The atmosphere is practically transparent in the visible region of the
electromagnetic spectrum and therefore, many of the satellite based remote sensing sensors
are designed to collect data in this region. Some of the commonly used atmospheric windows
are shown in the figure.
They are: 0.38-0.72 microns (visible), 0.72-3.00 microns (near infra-red and middle
infra-red), and 8.00-14.00 microns (thermal infra-red).
[Figure: Atmospheric transmission (%) versus wavelength from 0.3 microns to 1 mm, across the UV, visible and infrared regions, showing the transmitted windows and the blocked bands.]

SPECTRAL SIGNATURE CONCEPTS - TYPICAL SPECTRAL REFLECTANCE CHARACTERISTICS OF WATER, VEGETATION AND SOIL:
A basic assumption made in remote sensing is that a specific target has an individual and
characteristic manner of interacting with incident radiation. The manner of interaction is
described by the spectral response of the target. The spectral reflectance curves describe the
spectral response of a target in a particular wavelength region of electromagnetic spectrum,
which, in turn depends upon certain factors, namely, orientation of the sun (solar azimuth),
the height of the Sun in the sky (solar elevation angle), the direction in which the sensor is
pointing relative to nadir (the look angle) and nature of the target, that is, state of health of
vegetation.


Fig 2.3 Spectral Reflectance Curves


Every object on the surface of the earth has its unique spectral reflectance. Fig. 2.3 shows the
average spectral reflectance curves for three typical earth's features: vegetation, soil and
water. The spectral reflectance curve for vigorous vegetation manifests the "peak-and-valley"
configuration. The valleys in the visible portion of the spectrum are indicative of pigments in
plant leaves. The dip in reflectance (Fig. 2.3) at 0.65 µm is due to chlorophyll absorption,
while those at 1.4 µm and 1.9 µm are attributable to absorption of water by leaves. The soil
curve shows a more regular variation of reflectance. Factors that evidently affect soil
reflectance are moisture content, soil texture, surface roughness, and presence of organic
matter. The term spectral signature can also be used for spectral reflectance curves. A spectral
signature is a set of characteristics by which a material or an object may be identified on any
satellite image or photograph within a given range of wavelengths. Sometimes, spectral
signatures are used to denote the spectral response of a target.

The characteristic spectral reflectance curve Fig2.3 for water shows that from about 0.5µm, a
reduction in reflectance with increasing wavelength, so that in the near infrared range the
reflectance of deep, clear water is virtually zero (Mather, 1987). However, the spectral
reflectance of water is significantly affected by the presence of dissolved and suspended


organic and inorganic material and by the depth of the water body. Fig. 1.8 shows the spectral
reflectance curves for visible and near-infrared wavelengths at the surface and at 20 m depth.
Suspended solids
in water scatter the downwelling radiation, the degree of scatter being proportional to the
concentration and the color of the sediment. Experimental studies in the field and in the
laboratory as well as experience with multispectral remote sensing have shown that the
specific targets are characterized by an individual spectral response. Indeed, the successful
development of remote sensing of environment over the past decade bears witness to its
validity. In the remaining part of this section, typical and representative spectral reflectance
curves for characteristic types of the surface materials are considered. Imagine a beach on a
beautiful tropical island, and consider the interaction of electromagnetic radiation with the
top layer of sand grains on the beach. When an incident ray of electromagnetic radiation
strikes an air/grain interface, part of the ray is reflected and part of it is transmitted into the
sand grain. The solid lines in the figure represent the incident rays, and dashed lines 1, 2, and
3 represent rays reflected from the surface that have never penetrated a sand grain. The latter
are called specular rays by Vincent and Hunt (1968), and surface-scattered rays by Salisbury
and Wald (1992); these rays result from first-surface reflection from all grains encountered.
For a given
reflecting surface, all specular rays are reflected in the same direction, such that the angle of
reflection (the angle between the reflected rays and the normal, or perpendicular, to the
reflecting surface) equals the angle of incidence (the angle between the incident rays and the
surface normal). The measure of how much electromagnetic radiation is reflected off a
surface is called its reflectance, which is a number between 0 and 1.0. A measure of 1.0
means that 100% of the incident radiation is reflected off the surface, and a measure of 0
means that 0% is reflected.
ENERGY INTERACTIONS WITH EARTH SURFACE FEATURES
Energy incident on the Earth‘s surface is absorbed, transmitted or reflected depending on the
wavelength and characteristics of the surface features (such as barren soil, vegetation, water
body). Interaction of the electromagnetic radiation with the surface features is dependent on
the characteristics of the incident radiation and the feature characteristics. After interaction
with the surface features, energy that is reflected or re-emitted from the features is recorded at
the sensors and analysed to identify the target features, interpret the distance of the object,
and/or its characteristics.
This lecture explains the interaction of the electromagnetic energy with the Earth‘s surface
features.


Energy Interactions
The incident electromagnetic energy may interact with the earth surface features in three
possible ways: reflection, absorption and transmission.
[Figure: incident radiation at the Earth's surface being partly reflected, partly absorbed and partly transmitted.]
Reflection occurs when radiation is redirected after hitting the target. According to the law of
reflection, the angle of incidence is equal to the angle of reflection.
Absorption occurs when radiation is absorbed by the target. The EM energy which is
absorbed by the Earth's surface is available for emission as thermal radiation at longer
wavelengths.
Transmission occurs when radiation is allowed to pass through the target. Depending upon
the characteristics of the medium, during the transmission velocity and wavelength of the
radiation changes, whereas the frequency remains same. The transmitted energy may further
get scattered and / or absorbed in the medium.
These three processes are not mutually exclusive. Energy incident on a surface may be
partially reflected, absorbed or transmitted. Which process takes place on a surface depends
on the following factors:
 Wavelength of the radiation
 Angle at which the radiation intersects the surface
 Composition and physical properties of the surface

The relationship between reflection, absorption and transmission can be expressed through
the principle of conservation of energy. Let EI denote the incident energy, ER the reflected
energy, EA the absorbed energy and ET the transmitted energy.
Then the principle of conservation of energy (as a function of wavelength λ) can be expressed
as
EI (λ) = ER (λ) + EA (λ) + ET (λ) (1)
Since most remote sensing systems use reflected energy, the energy balance relationship can
be better expressed in the form
ER (λ) = EI (λ) - EA (λ) - ET (λ) (2)
The reflected energy is equal to the total energy incident on any given feature reduced by the
energy absorbed or transmitted by that feature.
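A small sketch of this bookkeeping; the numeric values below are hypothetical, chosen only to illustrate Eq. (2):

```python
# Sketch of the energy balance E_R = E_I - E_A - E_T at a single wavelength.
def reflected_energy(incident, absorbed, transmitted):
    """Reflected energy left over after absorption and transmission."""
    return incident - absorbed - transmitted

e_i, e_a, e_t = 100.0, 30.0, 20.0  # hypothetical energies, arbitrary units
e_r = reflected_energy(e_i, e_a, e_t)
print(e_r)        # 50.0
print(e_r / e_i)  # reflectance of 0.5 for this hypothetical feature
```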
Reflection
Reflection is the process in which the incident energy is redirected in such a way that the angle
of incidence is equal to the angle of reflection. The reflected radiation leaves the surface at the


same angle as it approached.


Scattering is a special type of reflection wherein the incident energy is diffused in many
directions and is sometimes called diffuse reflection.

When electromagnetic energy is incident on the surface, it may get reflected or scattered
depending upon the roughness of the surface relative to the wavelength of the incident
energy. If the roughness of the surface is less than the wavelength of the radiation or the ratio
of roughness to wavelength is less than 1, the radiation is reflected. When the ratio is more
than 1 or if the roughness is more than the wavelength, the radiation is scattered.
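The rule above reduces to a one-line classifier; the grain size used in the example is an assumed, order-of-magnitude value for fine sand:

```python
# Sketch of the roughness-to-wavelength criterion: ratio < 1 -> specular
# (smooth) reflection, ratio > 1 -> diffuse scattering.
def reflection_type(roughness_m, wavelength_m):
    return "specular" if roughness_m / wavelength_m < 1.0 else "diffuse"

SAND_GRAIN = 1e-4  # ~0.1 mm, assumed fine-sand roughness
print(reflection_type(SAND_GRAIN, 0.55e-6))  # 'diffuse' for visible light
print(reflection_type(SAND_GRAIN, 0.03))     # 'specular' for 3 cm microwaves
```

This matches the observation later in this section that fine sand appears rough in the visible portion of the spectrum but fairly smooth to long wavelength microwaves.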
The fraction of energy that is reflected / scattered is unique for each material. This aids in
distinguishing different features on an image.
A feature class denotes a distinguishing primitive characteristic or attribute of an image that
has been classified to represent a particular land cover type / spectral signature. Within one
feature class, the proportion of energy reflected, emitted or absorbed depends on the
wavelength. Hence, in one spectral range two features may be indistinguishable, but their
reflectance properties may be different in another spectral band. In multi-spectral remote
sensing, multiple sensors are used to record the reflectance from the surface features at
different wavelength bands and hence to differentiate the target features.
Variations in the spectral reflectance within the visible spectrum give the colour effect to the
features.
For example, blue colour is the result of more reflection of blue light. An object appears
"green" when it reflects highly in the green portion of the visible spectrum. Leaves appear
green since their chlorophyll pigment absorbs radiation in the red and blue wavelengths but
reflects green wavelengths. Similarly, water looks blue-green, blue or green when viewed
through the visible band because it reflects the shorter wavelengths and absorbs the longer
wavelengths in the visible band. Water also absorbs the near infrared wavelengths and hence
appears darker when viewed through red or near infrared wavelengths. The human eye uses
reflected energy variations in the visible spectrum to discriminate between various features.
For example, Fig. 5 shows a part of the Krishna River Basin as seen in different bands of the
Landsat ETM+ imagery. As the concepts of false colour composite (FCC) have been covered
in module 4, readers are advised to refer to the material in module 4 for a better understanding
of the colour composite imageries shown in Fig. 5. Reflectance of surface features such as
water, vegetation and fallow lands is different in different wavelength bands. A combination
of more than one spectral band helps


to attain better differentiation of these features.

Diffuse and Specular Reflection


Energy reflection from a surface depends on the wavelength of the radiation, angle of incidence
and the composition and physical properties of the surface.
Roughness of the target surface controls how the energy is reflected by the surface. Based on
the roughness of the surface, reflection occurs in mainly two ways.

Specular reflection: It occurs when the surface is smooth and flat. A mirror-like or smooth
reflection is obtained where complete or nearly complete incident energy is reflected in one
direction. The angle of reflection is equal to the angle of incidence. Reflection from the
surface is the maximum along the angle of reflection, whereas in any other direction it is
negligible.
Diffuse (Lambertian) reflection: It occurs when the surface is rough. The energy is reflected
uniformly in all directions. Since all the wavelengths are reflected uniformly in all directions,
diffuse reflection contains spectral information on the "color" of the reflecting surface. Hence,
in remote sensing the diffuse reflectance properties of terrain features are measured. Since the
reflection is uniform in all directions, sensors located in any direction record the same
reflectance and hence it is easy to differentiate the features.
Based on the nature of reflection, surface features can be classified as specular
reflectors or Lambertian reflectors. An ideal specular reflector completely reflects the incident
energy with the angle of reflection equal to the angle of incidence. An ideal Lambertian or
diffuse reflector scatters all the incident energy equally in all directions.
The specular or diffusive characteristic of any surface is determined by the roughness of the
surface in comparison to the wavelength of the incoming radiation. If the wavelengths of the
incident energy are much smaller than the surface variations or the particle sizes, diffuse
reflection will dominate. For example, in the relatively long wavelength radio range, rocky
terrain may appear smooth to incident energy. In the visible portion of the spectrum, even a
material such as fine sand appears rough while it appears fairly smooth to long wavelength
microwaves.
Most surface features of the earth are neither perfectly specular nor perfectly diffuse
reflectors. In near specular reflection, though the reflection is the maximum along the angle
of reflection, a fraction of the energy also gets reflected in some other angles as well. In near
Lambertian reflector, the reflection is not perfectly uniform in all the directions. The
characteristics of different types of reflectors are illustrated below.


[Figure: reflection patterns for ideal specular, near specular, near diffusive and ideal diffusive reflectors, showing the angle of incidence and angle of reflection.]

Lambertian reflectors are considered ideal for remote sensing. The reflection from an
ideal Lambertian surface will be the same irrespective of the location of the sensor. On the
other hand, in case of an ideal specular reflector, maximum brightness will be obtained only
at one location and for the other locations dark tones will be obtained from the same target.
This variation in the spectral signature for the same feature affects the interpretation of the
remote sensing data.
Most natural surfaces observed using remote sensing are approximately Lambertian at
visible and IR wavelengths. However, water provides specular reflection. Water generally
gives a dark tone in the image. However due to the specular reflection, it gives a pale tone
when the sensor is located in the direction of the reflected energy.
Spectral Reflectance of Earth Surface
Vegetation

In general, healthy vegetation is a very good absorber of electromagnetic energy in the visible
region. Chlorophyll strongly absorbs light at wavelengths around 0.45 µm (blue) and 0.67 µm
(red) and reflects strongly in green light; therefore our eyes perceive healthy vegetation as
green. Healthy plants have a high reflectance in the near-infrared between 0.7 and 1.3 µm.
This is primarily due to healthy internal structure of plant leaves. As this internal structure
varies amongst different plant species, the near infrared wavelengths can be used to
discriminate between different plant species.
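This red-absorption versus near-infrared-reflectance contrast underlies the widely used normalized difference vegetation index (NDVI); the index itself is not discussed in these notes, and the band reflectances below are hypothetical:

```python
# Sketch of NDVI = (NIR - red) / (NIR + red); dense healthy vegetation
# (low red, high NIR reflectance) pushes the index toward +1.
def ndvi(nir, red):
    return (nir - red) / (nir + red)

print(ndvi(nir=0.50, red=0.08))  # healthy vegetation -> ~0.72
print(ndvi(nir=0.30, red=0.15))  # stressed/sparse vegetation -> ~0.33
print(ndvi(nir=0.05, red=0.04))  # water/soil-like target -> near 0
```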

Water

In its liquid state, water has relatively low reflectance, with clear water having the greatest
reflectance in the blue portion of the visible part of the spectrum. Water has high absorption
and virtually no reflectance in near infrared wavelengths range and beyond. Turbid water has
a higher reflectance in the visible region than clear water. This is also true for waters
containing high chlorophyll concentrations.

Ice and Snow

Ice and snow generally have high reflectance across all visible wavelengths, hence their
bright white appearance. Reflectance decreases in the near infrared portion and there is
very low


reflectance in the SWIR (shortwave infrared). The low reflection of ice and snow in the
SWIR is related to their microscopic liquid water content. Reflectance differs for snow and
ice depending on the actual composition of the material including impurities and grain size.

Soil

Bare soil generally has an increasing reflectance, with greater reflectance in near-infrared and
shortwave infrared. Some of the factors affecting soil reflectance are:

 Moisture content

 Soil texture (proportion of sand, silt, and clay)

 Surface roughness

 Presence of iron oxide

 Organic matter content


Standard atmospheric profile:


The standard atmospheric profile is a representation of the vertical distribution of key
atmospheric parameters under standard atmospheric conditions. These conditions are
typically used as a reference for various scientific and engineering purposes. The standard
atmospheric profile provides information on temperature, pressure, density, and other
atmospheric properties at different altitudes.
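For the troposphere these quantities can be sketched numerically with the International Standard Atmosphere (ISA) relations; the sea-level constants and lapse rate below are standard ISA values, assumed here rather than quoted in these notes:

```python
# Sketch of ISA temperature and pressure in the troposphere (0-11 km),
# assuming sea-level values 288.15 K / 101325 Pa and a 6.5 K/km lapse rate.
def isa_troposphere(altitude_m):
    """Return (temperature in K, pressure in Pa) for altitudes up to ~11 km."""
    t0, p0, lapse = 288.15, 101325.0, 0.0065  # K, Pa, K/m (assumed ISA values)
    t = t0 - lapse * altitude_m               # linear temperature decrease
    p = p0 * (t / t0) ** 5.2561               # exponent g*M/(R*lapse) for dry air
    return t, p

print(isa_troposphere(0.0))      # (288.15 K, 101325 Pa) at sea level
print(isa_troposphere(11000.0))  # ~(216.65 K, ~22,600 Pa) at the tropopause
```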
Here is a brief overview of the standard atmospheric profile according to this model:
1. Troposphere (0 to 11 kilometres or 0 to 36,090 feet):
 Temperature: Generally decreases with altitude.
 Pressure: Decreases exponentially with altitude.
 Density: Decreases with altitude.
 Contains about 75-80% of the total atmospheric mass.
 Most weather events occur in this layer.
2. Stratosphere (11 to 50 kilometres or 36,090 to 164,042 feet):
 Temperature: Initially remains constant, then increases with altitude due to the
presence of the ozone layer.
 Pressure: Decreases with altitude.
 Density: Decreases with altitude.
 Contains the ozone layer, which absorbs and scatters ultraviolet (UV) solar
radiation.
3. Mesosphere (50 to 85 kilometres or 164,042 to 278,871 feet):
 Temperature: Decreases with altitude.
 Pressure: Decreases with altitude.
 Density: Decreases with altitude.
 Where most meteorites burn up upon entering the Earth's atmosphere.
4. Thermosphere (85 to 600 kilometres or 278,871 to 1,968,504 feet):
 Temperature: Increases significantly with altitude due to the absorption of
high-energy solar radiation.
 Pressure: Extremely low, almost a vacuum.
 Density: Extremely low, with individual molecules widely spaced.
 The region where the International Space Station (ISS) orbits.


Main Atmospheric Regions and its Characteristics:

The Earth's atmosphere is divided into several main regions, each with distinct characteristics
in terms of temperature, pressure, composition, and other properties.
1. Troposphere:
 Altitude: 0 to approximately 8-15 kilometers (0 to 5-9 miles).
 Characteristics:
 Decreasing temperature with altitude.
 Where weather events, including clouds and precipitation, occur.
 Contains the majority of the Earth's atmospheric mass.


2. Stratosphere:
 Altitude: Approximately 15 to 50 kilometers (9 to 31 miles).
 Characteristics:
 Temperature generally increases with altitude due to the presence of
the ozone layer.
 Contains the ozone layer, which absorbs and scatters ultraviolet (UV)
solar radiation.
 Jet streams are found in the upper part of this layer.
3. Mesosphere:
 Altitude: Approximately 50 to 85 kilometers (31 to 53 miles).
 Characteristics:
 Decreasing temperature with altitude.
 The region where meteorites burn up upon entering the Earth's
atmosphere.
 Temperatures reach the atmospheric minimum near the top of this layer (the mesopause).
4. Thermosphere:
 Altitude: Approximately 85 kilometers and extends upward to about 600
kilometers (53 miles to about 373 miles).
 Characteristics:
 Temperature increases significantly with altitude due to the absorption
of high-energy solar radiation.
 Extremely low pressure and density.
 The region where auroras occur.
5. Exosphere:
 Altitude: Beyond 600 kilometers (373 miles) and extends into space.
 Characteristics:
 Gradual transition to the vacuum of space.
 Very low density of gas particles.
 Satellites orbit in this region.


Interaction of radiation with atmosphere – Scattering, absorption and refraction

EMR interactions with the Earth’s atmosphere and surface


After electromagnetic radiation has been created by the Sun, the part of it that has found its
way through the vacuum of space to the top of the Earth’s atmosphere must pass through the
atmosphere, be reflected by the Earth’s surface, pass through the atmosphere again on its way
back to space, and then arrive at the sensor in order to be recorded. While nothing happens to
the radiation field as it passes through empty space, several things happen as it interacts with
the Earth’s atmosphere and surface. It is due to these interactions that the measured radiation
ends up containing information about the Earth environment, so it is important to take a
closer look at exactly what happens in these interactions, and how it affects the radiation
field.
Interactions with the atmosphere
The interaction between electromagnetic radiation and the Earth’s atmosphere can be
considered to have three components: refraction that changes the direction of propagation of
the radiation field due to density differences between outer space and the
atmosphere, scattering that changes the direction of propagation of individual photons as they
are absorbed and re-emitted by gasses or aerosols or other atmospheric constituents without
changing wavelength, and absorption that converts photons into vibrations in a molecule,
energy which is (later) re-emitted as one or more photons with longer wavelength(s). Each
will be considered in more detail below.
Refraction
Refraction is the bending (and slowing down) of the direction of propagation of
electromagnetic radiation as it moves between two media with different densities. This
happens as radiation arrives from outer space (density ≈0) and enters the atmosphere (density
>0). The angle at which the direction of propagation changes is determined by the refractive
indices of the two media. The refractive index of a medium (n) is determined as the ratio of
the speed of electromagnetic radiation in a vacuum (c) to the similar speed in the medium
(cn): n=c/cn. The refractive index of a standard atmosphere is 1.0003, while the refractive
index of water is 1.33. Using the refractive indices of the two media, the amount of refraction
can be determined with Snell’s Law: n1 * sinΘ1 = n2 * sinΘ2.
where n are the refractive indices of the two media and Θ are the angles at which the
direction of propagation intersects the normal of the surface separating the two media (Figure
22). Refraction is rarely a relevant factor in the practical use of remote sensing data. Its only
important influence concerns the georeferencing of imagery collected when the Sun is close
to the horizon, and this is a problem that is nearly always dealt with by the image provider.
One important situation in which refraction is important and must be considered is when an
image analyst needs to precisely geolocate underwater objects (such as features on the
seafloor in coastal areas).
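A minimal sketch of Snell's law with the refractive indices quoted above (air ≈ 1.0003, water ≈ 1.33):

```python
import math

# Sketch of Snell's law: n1*sin(theta1) = n2*sin(theta2).
def refraction_angle(theta1_deg, n1=1.0003, n2=1.33):
    """Angle (degrees from the surface normal) after entering the second medium."""
    sin_theta2 = n1 * math.sin(math.radians(theta1_deg)) / n2
    return math.degrees(math.asin(sin_theta2))

# A ray striking a water surface 30 degrees from the normal bends to ~22
# degrees, the kind of correction needed when geolocating seafloor features.
print(refraction_angle(30.0))
```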


One of the two remaining processes that influence electromagnetic radiation as it passes
through the atmosphere is scattering. Scattering happens when a photon interacts with
something in the atmosphere that causes it to change direction. Depending on the size of the
object that the photon interacts with, two distinct types of scattering are
recognized. Rayleigh scattering happens when the object is much smaller than the wavelength
of the radiation. In the case of sunlight and the Earth’s atmosphere this means that Rayleigh
scattering is caused by atmospheric gases like N2, O2, CO2 etc. Mie scattering happens when
the object is similar in size to the wavelength of the radiation, which means that it is caused
by aerosols like smoke and dust particles. Additional scattering can happen if radiation
interacts with particles larger in size than its wavelength, like water droplets or sand particles.
While refraction is predictable and can be determined by Snell’s Law, scattering is an
inherently stochastic process: what happens to an individual photon as it passes through the
atmosphere is entirely unpredictable, including whether or not it experiences any scattering,
and if so which direction it is reemitted in. However, the magnitude and direction of
scattering that happens on average to the many photons in a radiation field is predictable.
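The size dependence described above is commonly summarized with the Mie size parameter x = 2πr/λ; the regime thresholds used below are conventional rules of thumb, not values from the text:

```python
import math

# Sketch classifying the scattering regime from the size parameter
# x = 2*pi*r / wavelength (thresholds 0.1 and 50 are rough conventions).
def scattering_regime(radius_m, wavelength_m):
    x = 2.0 * math.pi * radius_m / wavelength_m
    if x < 0.1:
        return "Rayleigh (particle much smaller than wavelength)"
    if x < 50.0:
        return "Mie (particle comparable to wavelength)"
    return "non-selective (particle much larger than wavelength)"

print(scattering_regime(1e-10, 0.45e-6))  # gas molecule -> Rayleigh
print(scattering_regime(5e-7, 0.45e-6))   # smoke/dust aerosol -> Mie
print(scattering_regime(1e-5, 0.45e-6))   # water droplet -> non-selective
```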
Rayleigh scattering


A fact that has great importance for remote sensing of the Earth is that the magnitude of
Rayleigh scattering is inversely related to the 4th power of the wavelength of the radiation. In
other words, radiation with shorter wavelengths is scattered much more by Rayleigh


scattering than radiation at longer wavelengths. In the visible wavelengths, this means that
blue light is scattered more than green light, which in turn is scattered more than red light.
This is the process that makes the Earth’s oceans look blue when viewed from space. What
happens is that over very dark Earth surfaces, such as the oceans, the majority of radiation
reaching the Earth surface is absorbed rather than reflected by it. What is visible from space
is thus not radiation reflected by the surface, but rather radiation scattering from within the
atmosphere. Because blue wavelengths are those most strongly scattered through Rayleigh
scattering, this scattered radiation as a whole looks blue (Figure 23). Another effect of
Rayleigh scattering is that regardless of what is on the Earth’s surface, a space-based sensor
will detect a substantial amount of blue light coming from the Earth-Atmosphere system.
This can be a problem because the ‘blue signal’ from the atmosphere overwhelms variations
in ‘blue reflectance’ on the surface. But it can also be an advantage, because measurements in
the blue wavelengths can help assess the strength of Rayleigh scattering across the visible and


infrared spectrum, which can in turn be corrected for. This is the basis for the ‘aerosol’ band
that was included on Landsat 8 OLI (but was not found on its predecessor instruments), on
Sentinel-2, and on the WorldView-2 and -3 sensors.
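This wavelength dependence is easy to quantify. The short Python sketch below (the band-centre wavelengths of 450, 550 and 650 nm are representative choices, not values from any specific sensor) computes how strongly blue and green light are Rayleigh-scattered relative to red:

# Rayleigh scattering strength is proportional to 1 / wavelength^4.
wavelengths_nm = {"blue": 450, "green": 550, "red": 650}  # representative values

for name, wl in wavelengths_nm.items():
    factor = (650 / wl) ** 4
    print(f"{name}: scattered {factor:.2f}x as strongly as red light")
# blue: 4.35x, green: 1.95x, red: 1.00x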
While any scattering in the atmosphere is a source of noise (for those interested in using
satellite imagery to characterize the Earth’s surface), Rayleigh scattering is a relatively
benign source of noise because its wavelength dependence makes it largely predictable, and
because the gases responsible for it tend to have stable concentrations across space and time.
Rayleigh scattering is therefore not a source of great uncertainty for most remote sensing
applications.
Mie scattering
The strength and wavelength dependence of Mie scattering are set by the type and density of
the particulates that cause it, so Mie scattering varies substantially through time and space.
As a result it is one of the most important causes of uncertainty in remote sensing, especially
when using satellite data to study dark parts of the Earth’s surface from which the amount of
reflected radiation is small relative to the total signal from atmospheric scattering. For the
same reason it is hard to generalize its importance, but broadly speaking the strength of Mie
scattering exceeds that of Rayleigh scattering, and while it still diminishes with increasing
wavelength its influence extends further into the infrared spectrum. Because Mie scattering is
caused by atmospheric particulates, it is often dramatically increased during dust storms,
forest fires, or other events that cause the atmospheric aerosol load to increase. One such
example is seen in Figure 24.


Absorption
The last important thing that happens to electromagnetic radiation as it passes through the
atmosphere is that it is partially absorbed by atmospheric gases (mostly H2O, CO2 and O3).
While the energy absorbed is ultimately re-emitted by these gas molecules, the re-emission
happens at wavelengths typically outside the spectrum considered in optical remote sensing
(but which may be important for thermal remote sensing), so for practical purposes the
absorbed photons can be considered gone when absorbed. The strength of absorption is
highly dependent on wavelength because it happens most easily when the radiation has a
wavelength (frequency) that is similar to a resonant frequency of the gas doing the
absorption, which in turn depends on its atomic or molecular structure. For example, due to
its molecular structure, O2 is particularly good at absorbing electromagnetic radiation with
wavelengths right around 760 nm, but not at 750 or 770 nm. Similar wavelengths exist at
which other gases are effective or not at absorbing EMR, and in combination the
atmospheric gases let some wavelengths pass through the atmosphere with almost no
absorption, while other wavelengths are almost entirely absorbed before they reach the
surface of the Earth (Figure 25 and Figure 26). As is especially clear in Figure 26, water
vapour is responsible for much of the total gaseous absorption of EMR in the atmosphere,
including in the visible spectrum (not clearly shown on that figure). This is an important
challenge for remote sensing because while the concentrations of the other gasses are


relatively stable through time and space, water vapour concentrations vary greatly through
time (humid vs. dry days) and through space (dry arctic vs. humid tropical regions).

Interactions with the surface


The part of the radiation field that has made it through the atmosphere without being
absorbed or scattered back toward space now reaches the Earth’s surface. For any wavelength
that is of relevance to remote sensing, only one of two things can now happen to each
individual photon – it can be absorbed by the Earth’s surface, or it can be reflected back
toward space. The probability of reflection rather than absorption happening is termed the
reflectance of the surface, and it depends on the material on the surface as well as the
wavelength of the incoming radiation. Each surface material has a unique 'signature' that
defines what proportion of radiation is reflected for each wavelength. For example, water
reflects a small amount of blue and green wavelengths (typically around 5% – 10%
depending on turbidity), less of the red wavelengths, and almost nothing in the infrared
wavelengths. Vegetation, on the other hand, reflects around half of all incoming infrared
radiation, except for specific wavelengths that are effectively absorbed by liquid water in the
leaves. These spectral signatures are commonly portrayed as graphs, with wavelengths along
the x-axis and reflectance along the y-axis (as in Figure 27).


Spectral signatures are what enables us to differentiate between different materials on the
Earth’s surface when we look at a satellite image. As shown in Figure 27, water has near-zero
reflectance at wavelengths longer than 0.7 μm (700 nm), while both soil and green vegetation
have reflectances around 40% at 1.3 μm. Measuring the amount of radiation reflected off the
Earth-Atmosphere system at 1.3 μm will thus be particularly helpful at differentiating water
from the two terrestrial surface types. Similarly, measurements at wavelengths around 1.4 μm
(where liquid water in vegetation is a strong absorber) or 1.9 μm (likewise) can be effective to
differentiate between soil and green vegetation.
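To make this concrete, the sketch below uses rough reflectance values (invented for illustration, loosely inspired by the behaviour described above, not read off Figure 27) to classify a pixel from measurements at 1.3 μm and 1.4 μm:

# Illustrative reflectances (fractions); values are rough assumptions, not data.
#                 1.3 um   1.4 um
signatures = {
    "water":      (0.01,   0.01),
    "soil":       (0.40,   0.35),   # soil stays bright near 1.4 um
    "vegetation": (0.40,   0.10),   # leaf water absorbs strongly near 1.4 um
}

def classify(r13, r14):
    # Nearest-neighbour match on the two bands.
    return min(signatures, key=lambda k: (signatures[k][0] - r13) ** 2
                                       + (signatures[k][1] - r14) ** 2)

print(classify(0.02, 0.01))   # -> water
print(classify(0.38, 0.12))   # -> vegetation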
As a more detailed example, spectral signatures have been effective for large-scale geological
surveying/prospecting because different minerals (that may be characteristic of different sub-
surface conditions) can be identified through their unique spectral signatures (Figure 28).


The part of the radiation field that is reflected by the Earth’s surface must naturally make its
way back up through the atmosphere, with the attendant refraction, scattering and absorption,
before it can be measured by any space-based sensor. While there are many relative
advantages and disadvantages to air-borne vs. space-borne sensors, the ability of air-borne
sensors to measure the reflected EMR field before it has had to pass through the atmosphere a
second time is one distinct advantage.

Atmospheric Windows:
"atmospheric windows" refer to specific wavelength ranges in the electromagnetic spectrum
where the Earth's atmosphere is relatively transparent, allowing certain types of
electromagnetic radiation to pass through with minimal absorption or interference. These
windows are crucial for observations and measurements from ground-based or space-based
instruments, as they enable the study of celestial objects, weather patterns, and various Earth
processes. Different atmospheric windows exist for different regions of the electromagnetic
spectrum. Some key atmospheric windows include:


1. Visible Light:
 The atmosphere is highly transparent to visible light, allowing sunlight to
reach the Earth's surface. This transparency is essential for human vision and
for various optical observations.
2. Near-Infrared (NIR):


 Certain wavelengths in the near-infrared region are relatively transparent in
the Earth's atmosphere. This transparency is exploited in remote sensing
applications, such as satellite imagery, where NIR observations provide
valuable information about vegetation health, cloud cover, and surface
characteristics.
3. Shortwave Infrared (SWIR):
 Similar to NIR, the shortwave infrared region also has atmospheric windows
that allow certain wavelengths to pass through. This is useful for various
remote sensing applications, including geological studies and moisture content
measurements.
4. Radio Waves:
 In the radio frequency range, there are specific atmospheric windows that
allow radio waves to propagate efficiently. This is important for radio
astronomy and communication purposes.
5. Microwaves:
 Microwaves have specific atmospheric windows that are exploited in
applications such as weather radar, microwave ovens, and satellite
communication.
6. Millimeter and Submillimeter Waves:

 Certain atmospheric windows exist for observing millimeter and submillimeter
waves. These windows are crucial for studying molecular transitions in the
Earth's atmosphere and for astronomical observations.
Astronomers choose specific wavelengths at which the Earth's atmosphere is transparent for
ground-based and space-based observations to minimize distortion and absorption. Remote
sensing applications, including satellite observations and weather monitoring, also benefit
from knowledge about atmospheric windows to capture accurate and meaningful data.
Energy Balance Equation:
The energy balance equation is a fundamental principle in thermodynamics and physics,
expressing the conservation of energy within a system. It is commonly used in various fields,
including physics, engineering, and environmental science. The general form of the energy
balance equation can be stated as:
Energy In − Energy Out = Energy Stored + Energy Lost
In more specific terms, considering a closed system, the equation can be written as:


ΣE_in − ΣE_out = ΔE_system

Where:
 ΣE_in represents the sum of all forms of energy entering the system.
 ΣE_out represents the sum of all forms of energy leaving the system.
 ΔE_system represents the change in internal energy of the system.
This equation is based on the first law of thermodynamics, which states that energy cannot be
created or destroyed; it can only change forms. The energy balance equation helps analyze
and quantify the flow of energy in a given system, whether it's a physical process, a chemical
reaction, or an environmental system.
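As a simple numeric illustration for a land surface (all values below are assumptions chosen for the example), the sketch balances absorbed sunlight against thermal emission computed with the Stefan-Boltzmann law; the residual stands for stored energy and other fluxes:

# Toy surface energy balance: energy in (absorbed sunlight) vs energy out
# (thermal emission). All numbers are illustrative assumptions.
SIGMA = 5.670e-8          # Stefan-Boltzmann constant, W m^-2 K^-4

solar_in = 800.0          # incoming shortwave, W m^-2
albedo = 0.25             # fraction reflected
surface_temp = 300.0      # surface temperature, K
emissivity = 0.97

energy_in = (1 - albedo) * solar_in                   # absorbed shortwave
energy_out = emissivity * SIGMA * surface_temp ** 4   # emitted longwave

print(f"in  = {energy_in:.1f} W/m^2")                 # 600.0
print(f"out = {energy_out:.1f} W/m^2")                # ~445.5
print(f"stored/other fluxes = {energy_in - energy_out:.1f} W/m^2")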
Specular and diffuse reflectors:
1. Specular Reflectance in Remote Sensing:
 Characteristics:
 Specular reflection is associated with smooth and reflective surfaces.
 Light reflects off such surfaces in a specific direction, following the
law of reflection.
 Applications:
 Specular reflection is significant in the study of water bodies. For
instance, the sun's reflection on a calm water surface can create
specular highlights.
 It is relevant for monitoring highly reflective surfaces such as glass,
metal, or other smooth materials.
2. Diffuse Reflectance in Remote Sensing:
 Characteristics:
 Diffuse reflection involves the scattering of light in various directions,
typical of rough or non-reflective surfaces.


 Light interacts with the surface irregularities, scattering in multiple directions.
 Applications:
 Most natural surfaces, such as vegetation, soil, and rocky terrain,
exhibit diffuse reflectance.
 Diffuse reflection is crucial for assessing land cover types, as different
materials have distinct diffuse reflectance signatures.
 It is used in the analysis of urban areas where building materials and
land cover can vary widely.
Remote sensing instruments, such as satellites or airborne sensors, capture the reflected
energy from surfaces. Spectral signatures derived from these reflections help identify and
classify different materials on the Earth's surface. Analyzing the variations in reflectance
properties allows remote sensing scientists to interpret land cover, identify changes, and
monitor environmental conditions.


Spectral reflectance & emittance – Spectroradiometer:

Spectral Reflectance in Remote Sensing:


 Definition: Spectral reflectance refers to the ratio of the reflected light intensity from
a surface at a particular wavelength to the incident light intensity at the same
wavelength.
 Importance: Different materials have unique spectral reflectance signatures, and these
signatures are exploited in remote sensing to identify and classify land cover types.


 Measurement: Spectral reflectance is often measured across various wavelengths,
forming a spectral reflectance curve or spectrum for a particular material. These
curves help create spectral libraries for different surfaces.
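In field practice a spectroradiometer reading is commonly converted to reflectance by ratioing the radiance measured over the target to the radiance measured over a calibrated white reference panel. A minimal Python sketch (the radiance numbers are hypothetical, not real instrument output):

# Reflectance = target radiance / white-reference radiance, per wavelength.
wavelengths_nm = [450, 550, 650, 850]
target_radiance = [12.0, 30.0, 15.0, 180.0]    # e.g. a vegetation canopy
panel_radiance = [240.0, 300.0, 310.0, 360.0]  # calibrated white panel

reflectance = [t / p for t, p in zip(target_radiance, panel_radiance)]
for wl, r in zip(wavelengths_nm, reflectance):
    print(f"{wl} nm: reflectance = {r:.2f}")
# The high 850 nm value (0.50) is the classic near-infrared vegetation peak.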
Spectral Emittance in Remote Sensing:

 Definition: Spectral emittance is the ratio of the radiant exitance of a surface (emitted
radiation) at a particular wavelength to the radiant exitance of a perfect blackbody at
the same temperature.

 Importance: Spectral emittance is crucial for studying the thermal properties of
surfaces. Different materials emit thermal radiation in distinct ways, and this
information is valuable for applications such as land surface temperature estimation.

 Measurement: Similar to spectral reflectance, spectral emittance is measured across
different wavelengths, forming an emittance spectrum. This spectrum helps
understand how much thermal radiation is emitted by a surface at different
wavelengths.
Spectroradiometer in Remote Sensing:
 Definition: A spectroradiometer is an instrument used to measure the intensity of
radiation as a function of wavelength. It measures both the spectral reflectance and
emittance of surfaces.

 Components: A spectroradiometer typically includes a spectrometer, which disperses
light into its different wavelengths, and a radiometer, which measures the intensity of
light at those wavelengths.

 Applications: Spectroradiometers are employed in various remote sensing
applications, including satellite and airborne sensors. They help collect detailed
information about the spectral characteristics of the Earth's surface, aiding in the
identification of materials and the assessment of environmental conditions.

In summary, spectral reflectance and emittance, measured by spectroradiometers, are crucial
components of remote sensing. They provide valuable information about the interaction of
electromagnetic radiation with the Earth's surface, enabling scientists to characterize and
analyse different land cover types and understand thermal properties.


Spectral Signature concepts:


Spectral signature is a fundamental concept in remote sensing, describing the unique pattern
of energy reflected, emitted, or transmitted by an object or surface across different
wavelengths. Each material has a distinct spectral signature, and these signatures are used to
identify and classify land cover types, monitor changes, and extract valuable information in
remote sensing applications. Here are key concepts related to spectral signatures in remote
sensing:

1. Spectral Bands:
 Remote sensing instruments, such as satellites or airborne sensors, are
equipped with sensors that capture electromagnetic radiation in specific bands
or ranges of the electromagnetic spectrum.
 Each spectral band corresponds to a particular range of wavelengths, and the
combination of bands forms the spectral signature of a surface.

2. Reflectance and Absorption:


 Materials interact with incoming radiation in different ways. Some materials
absorb certain wavelengths, while others reflect or transmit them.
 The reflectance and absorption characteristics of materials contribute to the
shape of their spectral signature.

3. Spectral Reflectance Curves:


 The spectral signature is often represented by a spectral reflectance curve,
which shows the reflectance values of a material across different wavelengths.
 Peaks and valleys in the curve correspond to specific features or absorption
bands associated with certain materials.

4. Feature Identification:
 Spectral signatures are used to identify and discriminate between different
land cover features. For example, vegetation, water bodies, soil, and urban
areas have distinct spectral signatures.
 Comparing the spectral signature of an unknown area to spectral libraries
helps in feature identification.


5. Temporal Variability:
 Spectral signatures can vary over time due to seasonal changes, growth cycles,
or other environmental factors.
 Monitoring temporal changes in spectral signatures is valuable for tracking
land cover dynamics and assessing environmental conditions.

6. Supervised and Unsupervised Classification:


 In image classification, spectral signatures play a crucial role. Supervised
classification involves training a classifier using known spectral signatures,
while unsupervised classification identifies clusters of similar spectral
signatures without predefined classes.
7. Multispectral and Hyperspectral Imaging:
 Multispectral sensors capture data in a few distinct bands, while hyperspectral
sensors capture data in numerous narrow bands, providing more detailed
spectral information.
 Hyperspectral imaging allows for finer discrimination between materials with
similar spectral signatures.
Analysing spectral signatures is essential for extracting meaningful information from remote
sensing data, enabling applications such as land cover mapping, environmental monitoring,
and resource management.
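One common way of comparing an unknown pixel against a spectral library is the spectral angle: the angle between the two spectra treated as vectors, which makes the match insensitive to overall brightness differences. A minimal Python sketch (the four-band library values are invented):

import math

# Tiny 4-band spectral library (reflectance values are invented).
library = {
    "water":      [0.06, 0.08, 0.04, 0.01],
    "vegetation": [0.04, 0.08, 0.05, 0.50],
    "soil":       [0.10, 0.15, 0.20, 0.30],
}

def spectral_angle(a, b):
    # Angle between two spectra treated as vectors (radians).
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return math.acos(dot / (na * nb))

pixel = [0.05, 0.09, 0.06, 0.45]  # unknown pixel
best = min(library, key=lambda k: spectral_angle(pixel, library[k]))
print(best)  # -> vegetation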

Typical spectral reflectance curves for vegetation, soil and water:


Spectral reflectance curves for vegetation, soil, and water exhibit distinct patterns across
different wavelengths of the electromagnetic spectrum. These characteristic curves help in the
identification and classification of land cover types in remote sensing. Here are typical
spectral reflectance curves for vegetation, soil, and water:
1. Vegetation:
 Characteristics:
 Vegetation strongly absorbs radiation in the red and blue parts of the
spectrum due to the presence of chlorophyll.
 It reflects strongly in the near-infrared (NIR) region, resulting in a peak
in reflectance.
 The red-edge region (around 700-750 nm) is often characterized by a
distinctive change in reflectance due to chlorophyll absorption.


 Typical Spectral Reflectance Curve for Vegetation:


2. Soil:
 Characteristics:
 Soil typically has lower reflectance in the visible and higher
reflectance in the NIR.
 Absorption features in the shortwave infrared (SWIR) can be
associated with minerals present in the soil.
 Typical Spectral Reflectance Curve for Soil:
3. Water:
 Characteristics:
 Water reflects a small amount of blue and green light and absorbs most
radiation at longer visible wavelengths, particularly in the red.
 Absorption is very strong in the near-infrared and SWIR, so reflectance
there is close to zero.
 Turbid or sediment-laden water can show elevated reflectance, so the curve
varies with water quality and constituents.
 Typical Spectral Reflectance Curve for Water:
These curves illustrate the distinctive patterns associated with each land cover type, allowing
remote sensing scientists to use spectral signatures for classification and mapping. It's
important to note that variations in these spectral signatures can occur due to factors such as
vegetation health, soil moisture content, and water quality, influencing the overall reflectance
characteristics observed in remote sensing data.
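The strong red-absorption/NIR-reflection contrast of vegetation is the basis of the widely used Normalized Difference Vegetation Index, NDVI = (NIR − Red) / (NIR + Red). The sketch below applies it to illustrative reflectance values for the three cover types (the numbers are assumptions consistent with the curves described above):

# NDVI = (NIR - Red) / (NIR + Red); reflectance values are illustrative.
samples = {
    "vegetation": {"red": 0.05, "nir": 0.50},
    "soil":       {"red": 0.20, "nir": 0.30},
    "water":      {"red": 0.04, "nir": 0.01},
}

for name, s in samples.items():
    ndvi = (s["nir"] - s["red"]) / (s["nir"] + s["red"])
    print(f"{name}: NDVI = {ndvi:+.2f}")
# vegetation: +0.82, soil: +0.20, water: -0.60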

Solid surface scattering in microwave region:

Solid surface scattering in the microwave region refers to the interaction of microwave
radiation with the Earth's surface when it is predominantly composed of solid materials, such
as soil, rock, or man-made structures. Microwave remote sensing, particularly in the
microwave region of the electromagnetic spectrum, has applications in fields like radar


imaging, soil moisture estimation, and geological studies. Here are some key points related to
solid surface scattering in the microwave region:
1. Surface Roughness:
 The interaction between microwaves and a solid surface is influenced by the
roughness of the surface.
 In the microwave region, rough surfaces can cause scattering of the incident
radiation in various directions.
2. Frequency Dependence:
 The behavior of solid surface scattering depends on the frequency of the
microwaves.
 Higher frequency microwaves tend to interact more with the surface
roughness, leading to increased scattering effects.
3. Radar Cross Section (RCS):
 The Radar Cross Section is a measure of how well a target reflects radar
signals.
 Solid surfaces with irregularities or roughness can have a significant impact on
the RCS, influencing the detectability of objects.
4. Vegetation and Dielectric Properties:
 In areas with vegetation cover, the interaction of microwaves with leaves and
branches can also contribute to scattering.
 The dielectric properties of the solid materials play a role in determining how
microwaves penetrate or interact with the surface.
5. Soil Moisture Sensing:
 Microwave remote sensing is often used to estimate soil moisture content, as
the interaction between microwaves and the soil surface is influenced by its
moisture content.
 Moisture affects the dielectric properties of soil, impacting the scattering and
absorption of microwaves.
6. Geological Applications:
 In geological studies, microwave remote sensing can be used to analyze the
composition and structure of solid surfaces.
 Differences in the microwave response can help identify geological features
and material types.
7. Synthetic Aperture Radar (SAR):
 Synthetic Aperture Radar is a type of radar used in microwave remote sensing.


 SAR systems utilize microwave signals to create high-resolution images of the
Earth's surface, and solid surface scattering is a critical factor in SAR signal
interactions.
Solid surface scattering in the microwave region is essential for interpreting microwave
remote sensing data and extracting meaningful information about the Earth's surface,
especially in applications related to agriculture, hydrology, geology, and environmental
monitoring.
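Whether a surface looks smooth or rough to a radar depends on its height variation relative to the wavelength. A common rule of thumb is the Rayleigh roughness criterion: a surface appears smooth when its RMS height h is below λ/(8 cos θ), where θ is the incidence angle. A small Python sketch (the band wavelengths are standard values; the surface height and incidence angle are assumptions):

import math

def appears_smooth(rms_height_cm, wavelength_cm, incidence_deg):
    # Rayleigh roughness criterion: smooth if h < wavelength / (8 cos(theta)).
    limit = wavelength_cm / (8 * math.cos(math.radians(incidence_deg)))
    return rms_height_cm < limit

h = 1.0  # assumed RMS surface height, cm
for band, wl_cm in {"X-band": 3.0, "C-band": 5.6, "L-band": 23.0}.items():
    print(band, "smooth" if appears_smooth(h, wl_cm, 30) else "rough")
# The same field can look rough at X- and C-band yet smooth at L-band.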


UNIT 3
ORBITS AND PLATFORMS
INTRODUCTION
Remote sensing has revolutionized our ability to observe and understand the Earth's surface
and atmosphere from afar. By utilizing various platforms and orbits, remote sensing
technologies enable us to gather valuable data for a wide range of applications, including
environmental monitoring, disaster management, urban planning, agriculture, and climate
studies. In this introduction, we will explore the fundamentals of orbits and platforms used in
remote sensing and their significance in acquiring high-quality data.
Orbits
Orbits play a critical role in remote sensing missions as they determine the trajectory and
coverage of the satellite or sensor system.
TYPES
1. Low Earth Orbit (LEO)
2. Geostationary Orbit (GEO)
3. Polar Orbit
4. Sun-Synchronous Orbit (SSO)


Low Earth Orbit (LEO): Satellites in LEO typically orbit at altitudes ranging from 160 to
2,000 kilometers above the Earth's surface. These orbits provide high spatial resolution
imagery and frequent revisits to specific locations due to their relatively short orbital periods.
Geostationary Orbit (GEO): Satellites in GEO orbit at an altitude of approximately 35,786
kilometers above the equator. They maintain a fixed position relative to the Earth's surface,
making them ideal for continuous monitoring of specific regions, such as weather patterns
and environmental changes.


Polar Orbit: Polar orbiting satellites pass over the Earth's poles, providing global coverage
with each orbit. These orbits are commonly used for environmental monitoring, as they allow
for comprehensive observations of land, oceans, and atmosphere over time.
Sun-Synchronous Orbit (SSO): Satellites in SSO maintain a constant angle relative to the Sun
as they orbit the Earth, ensuring consistent lighting conditions during each pass over the same
area. This orbit is particularly useful for monitoring changes in vegetation, land use, and
other surface features.
Platforms:
Remote sensing platforms encompass a variety of vehicles or devices used to carry sensors
into the Earth's atmosphere or space. Some common platforms include:
Satellites: Satellites are spacecraft placed into orbit around the Earth or other celestial bodies.
They house remote sensing instruments that capture data across different wavelengths of the
electromagnetic spectrum.
Unmanned Aerial Vehicles (UAVs): UAVs, or drones, are aircraft operated without a human
pilot onboard. They are equipped with sensors capable of capturing high-resolution imagery
and collecting data over targeted areas with flexibility and cost-effectiveness.
Aircraft: Manned aircraft equipped with remote sensing instruments are used for airborne
data collection at various altitudes. These platforms offer higher spatial resolution compared
to satellite-based sensors and can be deployed for specialized missions or rapid response
tasks.
Ground-Based Platforms: Ground-based sensors and observatories are stationed on the
Earth's surface or mounted on fixed structures. They provide continuous monitoring of
specific locations and contribute to validating data collected from airborne or satellite
platforms.
MOTIONS OF PLANETS AND SATELLITES
The motions of planets and satellites play a crucial role in remote sensing applications,
influencing the positioning, coverage, and data acquisition capabilities of remote sensing
instruments. Understanding these motions is essential for optimizing the design and operation
of remote sensing missions.


Rotational Motion of Planets:


Planets rotate on their axes, causing day-night cycles and altering the illumination conditions
for remote sensing observations.
Remote sensing instruments need to account for this rotation to ensure consistent lighting
conditions and accurate data collection.
For example, satellites in sun-synchronous orbits are synchronized with the Earth's rotation,
ensuring consistent solar illumination during each orbit pass.
Orbital Motion of Planets:
Planets orbit around the Sun in elliptical paths according to Kepler's laws of planetary
motion.
The orbit of a planet affects the positioning and geometry of remote sensing platforms, such
as satellites orbiting Earth.
Remote sensing missions need to consider the orbital parameters of planets to optimize
coverage, revisit times, and data acquisition strategies.
Satellite Motion:
Satellites used for remote sensing orbit around planets like Earth, Mars, or other celestial
bodies.
Different types of orbits, including polar, geostationary, and sun-synchronous, influence
satellite motion and coverage patterns.
Satellites may also exhibit additional motions such as precession, nutation, and orbital drift,
which must be accounted for in mission planning and data processing.


Relative Motion Between Satellites and Planets:


It affects the viewing geometry and spatial resolution of remote sensing observations.
Satellites may pass over different latitudes and longitudes on the planet's surface during each
orbit, influencing the spatial coverage and distribution of acquired data.
Motion Correction Techniques:
To mitigate the effects of planetary and satellite motions on remote sensing data, various
correction techniques are employed.
These techniques include georeferencing, orthorectification, and image registration
algorithms, which compensate for geometric distortions caused by motion and terrain
variations.
Motion correction ensures accurate alignment of remote sensing images with geographic
coordinates, facilitating quantitative analysis and integration with other spatial datasets.
NEWTON'S LAW OF GRAVITATION
Newton's law of gravitation states that every particle in the universe attracts every other
particle with a force that is directly proportional to the product of their masses and inversely
proportional to the square of the distance between their centers. Mathematically, it can be
expressed as:
F = G × m1 × m2 / r²
Where,
F is the gravitational force between two objects,
G is the gravitational constant (approximately 6.674 × 10⁻¹¹ m³ kg⁻¹ s⁻²),
m1 and m2 are the masses of the two objects, and
r is the distance between the centers of the two objects.
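A quick numeric sketch of the law for a satellite in low Earth orbit (the satellite mass and altitude are assumed example values), together with the circular-orbit speed that this gravitational pull sustains:

G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
M_EARTH = 5.972e24  # Earth's mass, kg
R_EARTH = 6.371e6   # Earth's mean radius, m

m_sat = 1000.0            # assumed satellite mass, kg
r = R_EARTH + 700e3       # 700 km altitude (assumed)

force = G * M_EARTH * m_sat / r**2      # Newton's law of gravitation
v_circular = (G * M_EARTH / r) ** 0.5   # speed for a circular orbit

print(f"gravitational force ~ {force:.0f} N")          # ~ 7,970 N
print(f"circular orbit speed ~ {v_circular:.0f} m/s")  # ~ 7,500 m/s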

Satellite Orbits:


Newton's law of gravitation governs the motion of satellites in orbit around celestial bodies,
such as the Earth. The gravitational force between the satellite and the Earth determines the
shape, size, and stability of the satellite's orbit. Remote sensing satellites rely on specific
orbits, such as polar orbits or geostationary orbits, to achieve desired coverage and revisit
times.
Trajectory Planning:
Understanding the gravitational interactions between the satellite and other celestial bodies
(e.g., the Moon, the Sun) is crucial for trajectory planning in remote sensing missions. By
accounting for gravitational forces, mission planners can optimize satellite paths, minimize
fuel consumption, and ensure accurate positioning for data acquisition.
Orbital Dynamics:
Newton's law of gravitation influences various orbital parameters, including eccentricity,
inclination, and the argument of periapsis. These parameters dictate the orbital characteristics of remote
sensing satellites, such as their altitude, orbital period, and ground track. Precise control of
these parameters is essential for achieving desired observational objectives and optimizing
data collection strategies.
Gravitational Perturbations:
Gravitational perturbations from other celestial bodies can affect satellite orbits over time.
These perturbations may cause orbital precession, nodal regression, or secular drift, which
can impact the long-term stability and operational lifespan of remote sensing missions.
Understanding and mitigating these effects are vital for maintaining satellite performance and
data continuity.
GRAVITATIONAL FIELD AND POTENTIAL
The gravitational field and potential are fundamental aspects of Earth's geophysical
environment that influence remote sensing measurements and data interpretation.
Understanding these concepts is crucial for accurately analyzing remote sensing data and extracting meaningful
information about the Earth's surface and subsurface features.


Influence on Satellite Orbits:


The gravitational field of the Earth affects the orbits of satellites used in remote sensing
missions. Satellites orbiting the Earth experience gravitational forces that determine their
trajectory, altitude, and velocity.
Variations in the Earth's gravitational field due to uneven mass distribution (e.g., mountains,
oceans, and variations in the density of underlying geological structures) can perturb satellite
orbits, affecting their stability and accuracy in acquiring remote sensing data.
Geoid Modeling:
The geoid represents the equipotential surface of Earth's gravity field that best fits global
mean sea level. It serves as a reference surface for measuring elevations and understanding
the Earth's shape and gravity field.
Remote sensing techniques, such as satellite gravimetry and altimetry, are employed to
precisely measure variations in the geoid. These measurements contribute to refining geoid
models, which are essential for geodetic applications, including mapping, navigation, and
geophysical studies.
Gravity Anomalies:
Gravity anomalies are deviations from the average gravitational field of the Earth and are
indicative of subsurface geologic structures, such as sedimentary basins, volcanic features,
and mineral deposits.


Remote sensing technologies, such as satellite gravimetry and airborne gravity surveys, are
used to map gravity anomalies with high spatial resolution. These data aid in geological
mapping, mineral exploration, and understanding tectonic processes.
Subsurface Characterization:
Gravitational data, when integrated with other remote sensing datasets (e.g., multispectral
imagery, radar data), can provide valuable insights into subsurface characteristics, such as
lithology, density variations, and groundwater resources.
Gravity surveys, combined with geophysical inversion techniques, enable the estimation of
subsurface properties and the delineation of geological structures, facilitating resource
exploration and environmental assessment.
ESCAPE VELOCITY
Escape velocity is a concept in physics referring to the minimum velocity an object needs to
escape the gravitational pull of a massive body, such as a planet or a moon, without being
propelled further by additional force. In the context of remote sensing, escape velocity is not
directly relevant because remote sensing typically involves objects, such as satellites or
drones, that are intentionally placed into orbit around the Earth rather than being launched
into space or escaping Earth's gravitational field entirely.
KEPLER'S LAWS OF PLANETARY MOTION
Kepler's laws of planetary motion are a set of three fundamental principles describing the
motion of planets and other celestial bodies around the Sun. While these laws are primarily
concerned with the dynamics of celestial bodies in the solar system, they have implications
for remote sensing, particularly in the context of satellite orbits and orbital dynamics.


Kepler's First Law (Law of Ellipses):


Kepler's First Law states that the orbit of a planet around the Sun is an ellipse with the Sun at
one of the two foci.
In the context of remote sensing, satellites can be placed into various types of orbits around
the Earth, including elliptical orbits. While circular orbits are often preferred for their
simplicity and stability, certain missions may benefit from elliptical orbits, such as those
designed for polar observation or high-resolution imaging over specific regions.
Kepler's Second Law (Law of Equal Areas):
Kepler's Second Law states that a line segment joining a planet and the Sun sweeps out equal
areas during equal intervals of time.
This law implies that planets move faster when they are closer to the Sun (at perihelion) and
slower when they are farther away (at aphelion). Similarly, satellites in elliptical orbits
around the Earth experience variations in orbital velocity as they move closer to or farther
away from the planet.
Remote sensing satellites in elliptical orbits may encounter changes in orbital velocity and
ground speed, which can affect the timing and coverage of data acquisition over different
regions of the Earth's surface.
Kepler's Third Law (Law of Harmonies):
Kepler's Third Law relates the orbital period of a planet (or satellite) to its average distance
from the Sun (or central body). Specifically, the square of the orbital period is proportional to
the cube of the semi-major axis of the orbit.
In remote sensing, Kepler's Third Law influences the design and planning of satellite
missions. For example, satellites in higher orbits have longer orbital periods, resulting in
fewer revisits to specific locations on Earth but providing broader coverage. Conversely,
satellites in lower orbits have shorter orbital periods, leading to more frequent revisits but
narrower coverage swaths.
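For Earth satellites the Third Law can be written T = 2π√(a³/μ), with μ the Earth's gravitational parameter. A short Python sketch comparing a 700 km orbit (an assumed example altitude) with the geostationary altitude:

import math

MU_EARTH = 3.986e14   # Earth's gravitational parameter, m^3 s^-2
R_EARTH = 6.371e6     # mean radius, m

def period_minutes(altitude_m):
    a = R_EARTH + altitude_m  # semi-major axis of a circular orbit
    return 2 * math.pi * math.sqrt(a**3 / MU_EARTH) / 60

print(f"700 km LEO: {period_minutes(700e3):.0f} min")         # ~ 99 min
print(f"GEO (35,786 km): {period_minutes(35786e3):.0f} min")  # ~ 1,436 min (~24 h)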
ORBITAL ELEMENTS AND TYPES
Orbital elements and types play a crucial role in remote sensing missions, determining the
trajectory, coverage, and observational characteristics of satellites or sensors. Understanding
these elements and types is essential for designing, planning, and operating remote sensing
missions effectively.


ELEMENTS
Semi-Major Axis (a):
The semi-major axis is half of the major axis of an ellipse representing the orbit. It defines the
average distance between the satellite and the center of the Earth.
Eccentricity (e):
Eccentricity measures the deviation of an orbit from a perfect circle. It ranges from 0 (circular
orbit) to 1 (highly elliptical orbit), determining the shape of the orbit.
Inclination (i):
Inclination is the angle between the orbital plane and the equatorial plane of the Earth. It
defines the orientation of the orbit relative to the Earth's rotation axis.
Right Ascension of the Ascending Node (RAAN):
RAAN is the angle measured from a reference direction (typically the vernal equinox) to the
point where the orbit crosses the equatorial plane from south to north.
Argument of Perigee (ω):
The argument of perigee is the angle measured from the ascending node to the point of
closest approach (perigee) to the Earth's surface.
True Anomaly (ν):


True anomaly is the angle measured from the perigee to the current position of the satellite,
defining its position along the orbit.
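Since these six elements fully specify an orbit and a position on it, they map naturally onto a small record type. A minimal Python sketch (the example values are invented and only loosely resemble a sun-synchronous orbit):

from dataclasses import dataclass

@dataclass
class KeplerianElements:
    semi_major_axis_km: float   # a
    eccentricity: float         # e
    inclination_deg: float      # i
    raan_deg: float             # right ascension of the ascending node
    arg_perigee_deg: float      # argument of perigee (omega)
    true_anomaly_deg: float     # true anomaly (nu)

# Invented example, loosely resembling a sun-synchronous LEO:
sso = KeplerianElements(7071.0, 0.001, 98.2, 120.0, 90.0, 0.0)
print(sso)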
TYPES
1. Low Earth Orbit (LEO)
2. Geostationary Orbit (GEO)
3. Polar Orbit
4. Sun-Synchronous Orbit (SSO)
5. Molniya Orbit
6. Highly Elliptical Orbit (HEO)
Low Earth Orbit (LEO):
Satellites in LEO typically orbit at altitudes ranging from 160 to 2,000 kilometers above the
Earth's surface. LEOs offer high spatial resolution imagery and frequent revisits to specific
locations due to their short orbital periods.
Geostationary Orbit (GEO):
Satellites in GEO orbit at an altitude of approximately 35,786 kilometers above the equator.
They remain stationary relative to the Earth's surface, providing continuous monitoring of
specific regions, such as weather patterns.
Polar Orbit:
Polar orbiting satellites pass over the Earth's poles, providing global coverage with each orbit.
They are commonly used for environmental monitoring and scientific research due to their
comprehensive observational capabilities.
Sun-Synchronous Orbit (SSO):
Satellites in SSO maintain a constant angle relative to the Sun as they orbit the Earth,
ensuring consistent lighting conditions during each pass over the same area. SSOs are
suitable for monitoring changes in vegetation, land use, and climate.
Molniya Orbit:
Molniya orbits are highly elliptical orbits with high inclinations, optimized for providing
extended coverage of high-latitude regions. They are commonly used in communication and
remote sensing satellites for observing polar regions.
Highly Elliptical Orbit (HEO):
HEOs have highly elliptical shapes with apogees far from the Earth and perigees relatively
close to the planet. They are utilized for specialized missions requiring long dwell times over
specific areas, such as communication or Earth observation.
ORBITAL PERTURBATIONS AND MANEUVERS
Orbital perturbations and maneuvers are essential considerations in remote sensing missions
to ensure the stability, accuracy, and efficiency of satellite orbits for data acquisition.
Perturbations are deviations from the ideal orbital path caused by gravitational, atmospheric,


and other factors. Maneuvers involve intentional adjustments to the satellite's orbit to
compensate for perturbations or achieve specific mission objectives.


Orbital Perturbations:
Gravitational Perturbations:
Gravitational forces from the Earth, Moon, and other celestial bodies cause variations in the
satellite's orbit, leading to perturbations. These perturbations can include changes in orbital
eccentricity, inclination, and nodal regression over time.
Atmospheric Drag:
Satellites in low Earth orbit (LEO) experience atmospheric drag, causing their orbits to decay
gradually. This drag results from interactions with the Earth's atmosphere, particularly at
lower altitudes, and requires periodic maneuvers to maintain the satellite's altitude and orbital
parameters.
Solar Radiation Pressure:
Solar radiation exerts pressure on the satellite's surface, causing small accelerations that
affect its orbit. Solar radiation pressure perturbations can cause deviations in the satellite's
position, leading to drift over time and requiring periodic corrections.


Geopotential Variations:
Variations in the Earth's gravitational field due to uneven mass distribution (e.g., mountains,
oceans, and density variations in the Earth's interior) induce perturbations in satellite orbits.
These variations can affect orbital elements such as inclination, eccentricity, and orbital
precession.
Orbital Maneuvers:
Orbit Raising or Lowering:
Satellites in LEO may perform orbit-raising maneuvers to counteract atmospheric drag and
maintain their altitude. Conversely, orbit-lowering maneuvers can be conducted to deorbit the
satellite at the end of its operational life or to transition to a lower orbit for mission
requirements.
Plane Change Maneuvers:
Plane change maneuvers involve adjusting the satellite's inclination to align its orbital plane
with a desired ground track or to synchronize with other satellites in a constellation. These
maneuvers are useful for optimizing coverage and revisits over specific regions of interest.
Station-Keeping Maneuvers:
Station-keeping maneuvers are performed to maintain a satellite's position relative to a
specific location on the Earth's surface or to other satellites in a constellation. These
maneuvers ensure consistent coverage and facilitate continuous monitoring of target areas.
Collision Avoidance Maneuvers:
Satellites may perform collision avoidance maneuvers to mitigate the risk of collisions with
other space objects, such as debris or operational satellites. These maneuvers involve
adjusting the satellite's orbit to avoid potential collisions and ensure mission safety.
Orbital Resonance Adjustment:
Satellites in certain orbits, such as those in resonance with the Earth's rotation or other
celestial bodies, may require periodic adjustments to maintain resonance conditions or
prevent destabilizing effects.

TYPES OF REMOTE SENSING PLATFORMS


Remote sensing platforms encompass a variety of vehicles or devices used to carry sensors
into the Earth's atmosphere or space to collect data about the Earth's surface and atmosphere.
The three main types of remote sensing platforms are
1. Ground-based
2. Airborne
3. Spaceborne platforms.


Ground-Based Platforms:
Ground-based remote sensing platforms are stationary or mobile platforms located on the
Earth's surface. They include:
Fixed Observatories: These are permanent installations equipped with various sensors and
instruments for continuous monitoring of specific locations. Examples include weather
stations, flux towers, and seismic stations.
Mobile Platforms: Mobile platforms such as vehicles, boats, or drones are equipped with
remote sensing instruments and can traverse different terrains to collect data over specific
areas of interest. Mobile platforms offer flexibility and versatility in data collection.
Terrestrial LiDAR: Terrestrial LiDAR systems are ground-based laser scanning devices used
to capture high-resolution 3D data of terrain, vegetation, buildings, and infrastructure. They
are often used for mapping, urban planning, and infrastructure management.
Ground-based platforms are advantageous for their relatively low cost, ease of deployment,
and ability to collect data at high spatial resolutions. However, their coverage is limited
compared to airborne and spaceborne platforms.


Airborne Platforms:
Airborne remote sensing platforms operate from aircraft, helicopters, or unmanned aerial
vehicles (UAVs) and provide an intermediate level of altitude between ground-based and
spaceborne platforms. They include:
Manned Aircraft: Manned aircraft equipped with remote sensing instruments fly at various
altitudes to capture data over large areas. They are used for aerial photography, multispectral
imaging, and LiDAR mapping.
Unmanned Aerial Vehicles (UAVs): UAVs, or drones, are increasingly utilized for remote
sensing applications due to their ability to collect high-resolution data at low altitudes with


flexibility and cost-effectiveness. UAVs are used in agriculture, environmental monitoring,
disaster assessment, and infrastructure inspection.
Airborne platforms offer advantages such as rapid deployment, high spatial resolution, and
the ability to access remote or hazardous areas. However, they are limited by their endurance
and operational altitude compared to spaceborne platforms.

Spaceborne Platforms:
Spaceborne remote sensing platforms operate from satellites orbiting the Earth and provide a
global perspective of the planet's surface and atmosphere. They include:
Earth Observation Satellites: Earth observation satellites are equipped with a variety of
sensors, including optical, thermal, radar, and multispectral instruments. They orbit the Earth
at different altitudes and inclinations to capture data for various applications, including
environmental monitoring, weather forecasting, land use mapping, and disaster management.
Spaceborne LiDAR: Spaceborne LiDAR systems mounted on satellites are used to measure
the elevation of the Earth's surface with high precision. They provide valuable data for
mapping terrain, monitoring glaciers, forests, and urban areas, and assessing topographic
changes.
Spaceborne platforms offer global coverage, long-term monitoring capabilities, and access to
remote or inaccessible regions. However, they require significant investment in launch and
satellite development and have limitations in spatial resolution compared to airborne
platforms.


CLASSIFICATION OF SATELLITES
Satellites can be classified based on various criteria, including their orbits, missions, and
applications. There are two main types:
1. Sun-synchronous satellites
2. Geostationary satellites
Sun-Synchronous Satellites:
Also known as polar orbiting satellites, these orbit the Earth in a near-polar orbit while
maintaining a consistent angle relative to the Sun. This characteristic ensures that the satellite
passes over any given point on the Earth's surface at roughly the same local solar time during
each orbit. Sun-synchronous satellites typically have the following characteristics:
Orbit: Sun-synchronous satellites typically orbit the Earth in a near-polar, low Earth orbit
(LEO) at altitudes ranging from a few hundred to a few thousand kilometers. These orbits are
inclined at an angle relative to the equator, allowing the satellite to cover different latitudes
with each orbit while maintaining a consistent solar angle.
Advantages: Sun-synchronous satellites offer several advantages for remote sensing and
Earth observation missions:
Consistent Lighting Conditions: By maintaining a consistent angle relative to the Sun, Sun-
synchronous satellites ensure uniform lighting conditions during each pass over the Earth's
surface. This consistency is critical for applications such as vegetation monitoring, land cover
mapping, and change detection.
Seasonal Coverage: Sun-synchronous orbits allow satellites to cover the entire globe over the
course of several days or weeks, providing comprehensive seasonal coverage of the Earth's
surface.


Repeat Pass Capability: Sun-synchronous satellites have a repeatable ground track, enabling
them to revisit the same locations on the Earth's surface at regular intervals. This capability is
valuable for monitoring changes over time and detecting trends in environmental phenomena.
Applications: Sun-synchronous satellites are used for a wide range of applications, including
environmental monitoring, climate studies, land use mapping, agriculture, forestry, disaster
management, and scientific research.
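The constant Sun angle is obtained by choosing the inclination so that the Earth's equatorial bulge (the J2 term) precesses the orbital plane eastward by about 0.9856° per day, matching the Sun's apparent annual motion. A Python sketch of that standard calculation for an assumed 700 km circular orbit:

import math

MU = 3.986e14        # Earth's gravitational parameter, m^3 s^-2
R_E = 6.378e6        # equatorial radius, m
J2 = 1.08263e-3      # Earth's oblateness coefficient

altitude = 700e3                 # assumed, m
a = R_E + altitude               # circular orbit
n = math.sqrt(MU / a**3)         # mean motion, rad/s

# Required nodal precession: 360 degrees per year, eastward.
omega_dot = math.radians(360 / 365.2422) / 86400   # rad/s

# J2 nodal precession: omega_dot = -1.5 * J2 * n * (R_E/a)^2 * cos(i)
cos_i = -omega_dot / (1.5 * J2 * n * (R_E / a) ** 2)
print(f"sun-synchronous inclination ~ {math.degrees(math.acos(cos_i)):.1f} deg")
# ~ 98.2 deg for a 700 km orbit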


Geostationary Satellites:
Geostationary satellites, also known as geosynchronous equatorial orbit (GEO) satellites,
orbit the Earth directly above the equator at a fixed position relative to the Earth's surface.
These satellites orbit the Earth at the same rate as the Earth's rotation, resulting in a stationary
position relative to a specific point on the Earth's surface. Geostationary satellites typically
have the following characteristics:
Orbit: Geostationary satellites orbit the Earth at an altitude of approximately 35,786
kilometers above the equator. They orbit the Earth in the same direction and at the same rate
as the Earth's rotation, completing one orbit approximately every 24 hours.
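The 35,786 km figure follows from requiring the period to equal one sidereal day (about 86,164 s); rearranging Kepler's Third Law gives the orbital radius. A quick check:

import math

MU = 3.986e14          # Earth's gravitational parameter, m^3 s^-2
R_EQUATOR = 6.378e6    # m
T_SIDEREAL = 86164.1   # Earth's rotation period, s

# From T = 2*pi*sqrt(a^3/MU):  a = (MU * (T/(2*pi))^2)^(1/3)
a = (MU * (T_SIDEREAL / (2 * math.pi)) ** 2) ** (1 / 3)
print(f"GEO altitude ~ {(a - R_EQUATOR) / 1000:.0f} km")  # close to 35,786 km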
Advantages: Geostationary satellites offer several advantages for communications, weather
monitoring, and other applications:


Continuous Coverage: Geostationary satellites provide continuous coverage of a fixed area
on the Earth's surface, making them ideal for applications that require real-time monitoring,
such as weather forecasting, telecommunications, and disaster management.
High-Elevation Angles: Geostationary satellites provide high-elevation angles, allowing them
to capture wide-area images of the Earth's surface with minimal distortion.
Long-Term Monitoring: Geostationary satellites can monitor changes in weather patterns,
cloud cover, and environmental conditions over extended periods, facilitating long-term
climate studies and trend analysis.
Applications: Geostationary satellites are primarily used for weather monitoring,
telecommunications, broadcast television, navigation, and environmental monitoring.


LAGRANGE ORBITS
The Lagrange points, also known as libration points or Lagrangian points, are positions in
space where the gravitational forces of two large bodies, such as the Earth and the Moon or
the Earth and the Sun, balance the centripetal force felt by a smaller object. There are five
Lagrange points labeled L1 through L5. While Lagrange points are not typically used for
remote sensing satellites, they can be advantageous for certain specialized missions due to
their unique orbital characteristics. Let's explore how Lagrange points could potentially be
utilized for remote sensing:
Lagrange Point 1 (L1):
L1 is located between the Earth and the Sun, directly along the line connecting their centers.
At this point, the gravitational forces of the Earth and the Sun balance out, allowing a satellite
to maintain a relatively stable position with respect to both bodies.
Advantages for Remote Sensing: A satellite positioned at L1 could provide continuous solar
observation, monitoring space weather phenomena, and providing early warning of solar
storms, which can impact satellite communications and power grids on Earth.


Lagrange Point 2 (L2):


L2 is located on the opposite side of the Earth from the Sun, approximately 1.5 million
kilometers away from Earth. Like L1, the gravitational forces of the Earth and the Sun
balance out at this point, providing a stable orbit for satellites.
Advantages for Remote Sensing: Satellites positioned at L2 could offer uninterrupted views
of the dark side of the Moon and provide valuable data for lunar exploration missions.
Additionally, L2 could serve as a platform for space-based observatories, offering a vantage
point for astronomical observations away from the interference of Earth's atmosphere.
Lagrange Point 5 (L5):
L5 trails the Earth by approximately 60 degrees in its orbit around the Sun (its counterpart,
L4, leads by the same angle), forming an equilateral triangle with the Earth and the Sun. At this point, the
gravitational forces of the Earth and the Sun, along with the centrifugal force of their
combined motion, provide a stable location for satellites.
Advantages for Remote Sensing: Satellites positioned at L5 could offer a unique perspective
for Earth observation, potentially providing continuous monitoring of specific regions or
phenomena, such as weather patterns, climate change, or environmental phenomena.
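The roughly 1.5 million km quoted above for L2 (and similarly for L1) can be recovered from a standard approximation: for a small body of mass m orbiting a much larger mass M at distance a, L1 and L2 lie about a·(m/3M)^(1/3) from the smaller body. A quick check with Earth-Sun values:

M_SUN = 1.989e30     # kg
M_EARTH = 5.972e24   # kg
AU_KM = 1.496e8      # Earth-Sun distance, km

# Approximate distance of L1/L2 from Earth along the Sun-Earth line.
r = AU_KM * (M_EARTH / (3 * M_SUN)) ** (1 / 3)
print(f"L1/L2 distance from Earth ~ {r / 1e6:.2f} million km")  # ~ 1.50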


UNIT IV- SENSING TECHNIQUES


CLASSIFICATION OF REMOTE SENSORS
Remote sensors can be classified in various ways based on their characteristics,
operating principles, and applications.
1. Based on Energy Source
2. Based on the Spectrum of Measurement
3. Based on Platform
4. Based on Spatial Resolution
5. Based on Application
Based on Energy Source:
Passive Sensors: These sensors measure natural energy (e.g., sunlight) reflected or emitted
by objects in the Earth's surface or atmosphere. Examples include optical sensors (visible,
infrared) and thermal sensors.
Active Sensors: These sensors emit energy (e.g., microwaves, lasers) and measure the energy
reflected or backscattered by objects. Examples include RADAR (Radio Detection and
Ranging) and LIDAR (Light Detection and Ranging) sensors.


Based on the Spectrum of Measurement:


Visible Spectrum Sensors: Capture electromagnetic radiation within the visible range
(approximately 400 to 700 nanometers) and are commonly used for color imaging.
Infrared Sensors: Capture electromagnetic radiation beyond the visible spectrum, including
near-infrared (NIR), short-wave infrared (SWIR), mid-wave infrared (MWIR), and thermal
infrared (TIR). These sensors are useful for applications such as vegetation analysis, soil
moisture assessment, and thermal mapping.
Microwave Sensors: Operate in the microwave portion of the electromagnetic spectrum and
are particularly suitable for applications requiring penetration through clouds, vegetation, and
soil. They are commonly used for radar imaging and sensing.


Based on Platform:
Satellite Sensors: Mounted on satellites orbiting the Earth, these sensors provide global
coverage and are used for various applications such as environmental monitoring, weather
forecasting, and disaster management.
Aerial Sensors: Mounted on aircraft or drones, these sensors provide high-resolution
imagery and are suitable for localized and rapid data collection over specific areas.
Ground-Based Sensors: Fixed or mobile sensors deployed on the ground, which are used for
specific applications such as weather monitoring, traffic monitoring, and environmental
research.


Based on Spatial Resolution:


High-Resolution Sensors: Provide detailed imagery with fine spatial resolution, suitable for
applications requiring detailed mapping and analysis.
Medium-Resolution Sensors: Offer moderate levels of detail, suitable for regional mapping
and land cover classification.
Low-Resolution Sensors: Provide broader coverage with lower detail, suitable for global-
scale studies and monitoring.

Based on Application:

Environmental Monitoring: Sensors used for assessing and monitoring environmental
parameters such as land cover, vegetation health, water quality, and air pollution.
Weather and Climate Monitoring: Sensors used for measuring meteorological parameters
such as temperature, humidity, precipitation, and atmospheric composition.
Defense and Security: Sensors used for surveillance, reconnaissance, and intelligence
gathering in defense and security applications.
Agriculture and Forestry: Sensors used for monitoring crop health, estimating yields,
assessing forest resources, and detecting forest fires.


RESOLUTION CONCEPT
The concept of resolution in remote sensing refers to the ability of a sensor to
distinguish between objects or features in the Earth's surface or atmosphere. It is a critical
aspect that determines the level of detail present in the imagery or data collected by the
sensor. Resolution can be classified into several types:
1. Spatial Resolution
2. Spectral Resolution
3. Temporal Resolution
4. Radiometric Resolution
Spatial Resolution:
 Spatial resolution refers to the size of the smallest discernible or resolvable feature in
the imagery. For optical sensors, it is typically measured in terms of meters per pixel
or centimeters per pixel on the ground.
 Higher spatial resolution means smaller pixel sizes and greater detail in the imagery,
allowing for the identification of smaller objects or features.
Spatial resolution is influenced by factors such as the sensor's spatial sampling capabilities,
altitude, and optics.
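For an optical instrument, the ground pixel size (ground sample distance, GSD) at nadir can be estimated as GSD = altitude × detector pixel pitch / focal length. A sketch with assumed instrument values:

# GSD = altitude * detector pixel pitch / focal length (nadir viewing).
# All instrument values below are assumptions for illustration.
altitude_m = 700e3      # orbit altitude
pixel_pitch_m = 7e-6    # 7 micrometre detector pixels
focal_length_m = 10.0

gsd = altitude_m * pixel_pitch_m / focal_length_m
print(f"ground sample distance ~ {gsd:.2f} m")  # ~ 0.49 m per pixel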
Spectral Resolution:
 Spectral resolution refers to the ability of a sensor to distinguish between different
wavelengths or bands of electromagnetic radiation.
 It is determined by the number and width of the spectral bands captured by the sensor.
 Sensors with higher spectral resolution can discriminate between a greater number of
spectral features, enabling more detailed analysis of surface properties such as
vegetation health, mineral composition, and land cover types.

Temporal Resolution:
 Temporal resolution refers to the frequency at which a sensor revisits or acquires data
over a particular area.
 It is measured in terms of the time interval between successive observations.


 Sensors with higher temporal resolution provide more frequent updates of the Earth's
surface, allowing for monitoring of dynamic processes such as land cover changes,
crop growth cycles, and natural disasters.

Radiometric Resolution:
 Radiometric resolution refers to the sensor's ability to detect and record variations in
the intensity or brightness of electromagnetic radiation.
 It is determined by the number of bits used to represent the digital values of the
recorded data.
 Higher radiometric resolution enables the sensor to capture subtle differences in
reflectance or emission, leading to greater sensitivity and accuracy in quantitative
analysis.
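
As a quick illustration of radiometric resolution, here is a minimal sketch (Python) that assumes only the standard 2 ** bits relationship between bit depth and grey levels; the bit depths chosen are illustrative, not tied to any particular sensor.

    # Number of brightness levels a sensor can record grows as 2 ** bits.
    def grey_levels(bits: int) -> int:
        # Distinct digital values available at a given bit depth.
        return 2 ** bits

    for bits in (6, 8, 11, 16):  # illustrative bit depths only
        print(f"{bits}-bit data -> {grey_levels(bits)} brightness levels")

For example, moving from 8-bit to 11-bit data increases the number of recordable levels from 256 to 2048, which is what allows subtler reflectance differences to be captured.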

SCANNERS
In remote sensing, scanners can be classified based on the direction in which they acquire
image data relative to the platform's movement. Two common types of scanners are
1. Along-track scanners
2. Across-track scanners
Along-Track Scanners:
 Along-track scanners, also known as "pushbroom" scanners, acquire image data in the
direction of the platform's movement.
 These scanners use linear or two-dimensional arrays of detectors to capture a
continuous swath of data perpendicular to the platform's flight path.
 As the platform moves forward, the detectors collect data continuously along the
track, producing an image composed of adjacent scan lines.
 Examples of platforms equipped with along-track scanners include most satellite
sensors, where the satellite moves along its orbital path while scanning the Earth's
surface below.


Across-Track Scanners:
 Across-track scanners, also known as "whiskbroom" scanners, acquire image data
across the platform's track, perpendicular to the direction of movement.
 These scanners typically use a single or multiple detectors that scan across the swath
of interest as the platform moves forward.
 The detectors collect data along individual scan lines across the swath, and the
platform may need to make multiple passes to cover the entire area of interest.
 Examples of platforms equipped with across-track scanners include some airborne
sensors and ground-based systems.

Key Differences:


Spatial Coverage:
 Along-track scanners cover a continuous swath perpendicular to the platform's path,
providing a wider spatial coverage in a single pass.
 Across-track scanners cover a swath across the platform's path, requiring multiple
passes or scans to achieve the same spatial coverage as along-track scanners.
Image Formation:
 Along-track scanners produce images composed of adjacent scan lines collected
continuously along the platform's track.
 Across-track scanners produce images composed of scan lines collected across the
swath width, typically with gaps between adjacent lines that may need to be stitched
together.
Applications:
 Along-track scanners are well-suited for satellite-based remote sensing applications,
where wide-area coverage is essential.
 Across-track scanners are commonly used in airborne remote sensing applications,
where high spatial resolution and detailed imaging of smaller areas are required.

OPTICAL SENSORS
Principle: Optical sensors operate in the visible, near-infrared (NIR), and short-wave
infrared (SWIR) regions of the electromagnetic spectrum. They detect the reflected sunlight
from the Earth's surface. Different materials reflect and absorb light differently, allowing
optical sensors to discern various features on the ground.

Applications: Optical sensors are widely used in land cover classification, vegetation
monitoring, urban planning, agriculture, and environmental studies.
Calibration: Calibration of optical sensors involves correcting for radiometric and geometric
distortions in the imagery. Radiometric calibration ensures that pixel values represent
accurate reflectance values. Geometric calibration corrects for distortions such as terrain
relief, sensor tilt, and Earth curvature.


INFRARED SENSORS
Principle: Infrared sensors operate in the infrared portion of the electromagnetic spectrum,
beyond the visible range. They detect thermal radiation emitted by objects. Infrared sensors
can be further divided into near-infrared (NIR), short-wave infrared (SWIR), mid-wave
infrared (MWIR), and thermal infrared (TIR) sensors, each sensitive to different wavelengths.

Applications: Infrared sensors are used for applications such as vegetation health
assessment, soil moisture estimation, mineral identification, and heat mapping.
Calibration: Calibration of infrared sensors involves correcting for sensor noise,
atmospheric effects, and temperature variations. Radiometric calibration ensures that pixel
values accurately represent thermal radiance or temperature values.

THERMAL SENSORS
Principle: Thermal sensors operate specifically in the thermal infrared (TIR) region of the
electromagnetic spectrum, detecting the thermal radiation emitted by objects. They measure
the temperature of objects or surfaces based on their thermal emissions.


Applications: Thermal sensors are used for applications such as monitoring land surface
temperature, detecting heat anomalies, assessing thermal properties of buildings, and
identifying thermal signatures of vegetation stress or fires.
Calibration: Calibration of thermal sensors involves ensuring accurate temperature
measurements by calibrating the sensor's response to known temperature references.
Corrections are made for sensor drift, non-uniformity, and atmospheric effects.

MICROWAVE SENSORS
Principle: Microwave sensors operate in the microwave portion of the electromagnetic
spectrum. They emit microwave pulses and measure the backscattered radiation reflected
from the Earth's surface. Microwave sensors can penetrate clouds, vegetation, and soil,
making them useful for all-weather and day-night imaging.


Applications: Microwave sensors are used for applications such as terrain mapping, soil
moisture estimation, sea surface monitoring, ice detection, and agricultural monitoring.
Calibration: Calibration of microwave sensors involves correcting for system noise, antenna
patterns, and atmospheric effects. Radiometric calibration ensures accurate measurements of
backscattered microwave signals.

CALIBRATION OF SENSORS
Calibration of sensors in remote sensing is a critical process to ensure that the data collected
by the sensors are accurate, reliable, and consistent. Calibration involves a series of steps to
correct for various sources of error and uncertainty in the sensor measurements.
1. Radiometric Calibration
2. Geometric Calibration
3. Temporal Calibration
4. Cross-Track Calibration
5. In-Flight Calibration
Radiometric Calibration:
Purpose: Radiometric calibration ensures that the digital numbers (DN) or sensor readings
recorded by the sensor accurately represent the radiance or reflectance of the objects being
observed.


Steps:
 Response Calibration: The sensor's response to known radiance or reflectance
standards is measured and used to establish a calibration curve relating sensor
readings to physical units (e.g., watts per square meter per steradian).
 Correction for Systematic Errors: Corrections are applied to compensate for sensor-
specific errors such as dark current, sensor gain variations, non-linearity, and stray
light.
 Atmospheric Correction: Corrections are made to account for atmospheric effects
such as scattering, absorption, and path radiance, which can affect the observed
radiance values.
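
A minimal sketch of the response-calibration step above, assuming the common linear model (radiance = gain x DN + offset); the gain and offset values here are invented for illustration and do not belong to any real sensor.

    import numpy as np

    def dn_to_radiance(dn: np.ndarray, gain: float, offset: float) -> np.ndarray:
        # Convert raw digital numbers to at-sensor radiance via the
        # calibration curve established from known radiance standards.
        return gain * dn.astype(np.float64) + offset

    dn = np.array([[12, 130], [200, 255]], dtype=np.uint8)  # toy 2x2 image
    print(dn_to_radiance(dn, gain=0.05, offset=1.2))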
Geometric Calibration:
Purpose: Geometric calibration ensures that the spatial relationships between objects in the
imagery are accurately represented, correcting for distortions introduced by the sensor and
platform.
Steps:
 Sensor Model Calibration: Mathematical models are used to characterize the sensor's
geometric properties, including its focal length, lens distortion, and sensor orientation.
 Ground Control Points (GCPs): GCPs with known coordinates on the Earth's surface
are identified in the imagery and used to estimate and correct geometric distortions
such as scale, rotation, and translation.
 Orthorectification: Orthorectification is performed to project the image pixels onto a
map coordinate system, correcting for terrain relief and platform tilt effects.
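
A minimal sketch of GCP-based correction, simplified to a 2-D affine transform fitted by least squares; the pixel and map coordinates below are invented ground control points, not survey data.

    import numpy as np

    img = np.array([[10, 10], [900, 20], [40, 950], [880, 930]], float)  # (col, row)
    mapc = np.array([[500100.0, 4200900.0], [500990.0, 4200880.0],
                     [500130.0, 4200010.0], [500970.0, 4200040.0]])      # (x, y)

    A = np.hstack([img, np.ones((len(img), 1))])      # design matrix [col, row, 1]
    coeff, *_ = np.linalg.lstsq(A, mapc, rcond=None)  # 3x2 affine coefficients

    def to_map(col: float, row: float) -> np.ndarray:
        # Map an image position to ground coordinates with the fitted transform.
        return np.array([col, row, 1.0]) @ coeff

    print(to_map(455, 480))

A real orthorectification adds terrain relief and sensor-model terms, but the GCP-fitting idea is the same.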
Temporal Calibration:
Purpose: Temporal calibration ensures temporal consistency and continuity in the sensor
data over time, allowing for meaningful comparisons and analysis of multi-temporal datasets.
Steps:
 Inter-Sensor Calibration: If data are collected from multiple sensors or platforms,
calibration procedures are performed to ensure consistency and compatibility between
datasets.
 Radiometric Normalization: Datasets acquired at different times may exhibit
variations in radiometric properties due to changes in atmospheric conditions, solar
angle, and sensor characteristics. Radiometric normalization techniques are applied to
standardize the data to a common radiometric scale.
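
A minimal sketch of relative radiometric normalization, assuming a set of pseudo-invariant pixels and a simple linear regression; all numbers are synthetic.

    import numpy as np

    rng = np.random.default_rng(0)
    invariant_a = rng.uniform(20, 200, 100)                       # reference date
    invariant_b = 0.9 * invariant_a + 8 + rng.normal(0, 2, 100)   # same targets, later date

    # Fit B -> A so that image B can be rescaled onto A's radiometric scale.
    slope, intercept = np.polyfit(invariant_b, invariant_a, deg=1)
    print(f"B_normalized = {slope:.3f} * B + {intercept:.3f}")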
Cross-Track Calibration:
Purpose: Cross-track calibration ensures uniformity and consistency in image quality across
the entire swath width of the sensor.
Steps:
 Detector Response Calibration: Detector response variations across the sensor's field
of view are measured and corrected to ensure uniform sensitivity and accuracy.


 Stray Light Correction: Stray light from adjacent pixels or off-nadir angles can
contaminate the signal, leading to inaccuracies in the image. Corrections are applied
to minimize stray light effects.
In-Flight Calibration:
Purpose: In-flight calibration involves periodic measurements and adjustments made during
sensor operation to monitor and maintain sensor performance over time.
Steps:
 Onboard Calibration Targets: Some sensors are equipped with onboard calibration
targets or instruments to monitor sensor stability and performance.
 Regular Monitoring: Sensor parameters such as signal-to-noise ratio, dynamic range,
and stability are monitored and recorded during routine operations. Adjustments and
recalibrations are made as needed to ensure data quality.

HIGH RESOLUTION SENSORS


Principle: High-resolution sensors capture imagery with finer spatial detail compared to
standard-resolution sensors. They may utilize various technologies such as along-track or
across-track scanning, multiple spectral bands, and advanced optics to achieve high spatial
resolution.


Applications: High-resolution sensors are used for detailed mapping, urban planning,
infrastructure monitoring, disaster assessment, and other applications requiring fine spatial
detail.
Calibration: Calibration of high-resolution sensors involves ensuring accuracy in spatial and
radiometric measurements. Geometric calibration corrects for distortions in the imagery,
while radiometric calibration ensures accurate representation of pixel values.


LIDAR, UAV ORBITAL AND SENSOR CHARACTERISTICS OF LIVE INDIAN EARTH OBSERVATION SATELLITES
India has several operational Earth observation satellites that provide data for various
applications, including agriculture, forestry, disaster management, urban planning, and
environmental monitoring. Here are some characteristics of a few prominent Indian Earth
observation satellites along with information on LIDAR and UAV platforms:


LIDAR:
 India has utilized LIDAR technology for various applications, including topographic
mapping, forest canopy analysis, urban planning, and infrastructure monitoring.
 The Indian Space Research Organisation (ISRO) has developed LIDAR payloads for
some of its satellites, such as the Terrain Mapping Camera (TMC) onboard the
Chandrayaan-1 lunar mission and the Chandrayaan-2 mission, which included the
Terrain Mapping Camera-2 (TMC-2) for lunar surface topography mapping.
UAV (Unmanned Aerial Vehicle):
 UAVs are increasingly being used for remote sensing applications in India,
particularly for high-resolution imaging, agricultural monitoring, disaster assessment,
and infrastructure inspection.
 Indian institutions and organizations, including ISRO, the Indian Institute of Remote
Sensing (IIRS), and various research institutes and universities, have been involved in
the development and deployment of UAVs for remote sensing purposes.
 UAV platforms equipped with multispectral or hyperspectral sensors are utilized for
crop monitoring, land cover mapping, forest health assessment, and environmental
monitoring.
Orbital Earth Observation Satellites:


 ResourceSat series: The ResourceSat series comprises multispectral Earth observation
satellites developed by ISRO. These satellites carry payloads such as the Linear
Imaging Self-Scanning Sensor (LISS) and Advanced Wide Field Sensor (AWiFS) for
medium-resolution imaging.
 Cartosat series: The Cartosat series includes high-resolution Earth observation
satellites designed for cartographic applications, urban planning, and infrastructure
development. These satellites carry panchromatic and multispectral cameras capable
of capturing imagery with sub-meter to sub-half-meter resolution.
 RISAT series: The Radar Imaging Satellite (RISAT) series consists of synthetic
aperture radar (SAR) satellites for all-weather, day-and-night imaging. These
satellites provide high-resolution radar imagery for applications such as agriculture,
forestry, soil moisture estimation, and disaster management.
 EOS series: The EOS (Earth Observation Satellite) series includes satellites equipped
with optical and microwave sensors for various remote sensing applications. These
satellites carry payloads such as optical cameras, hyperspectral sensors, and
microwave radiometers for imaging and data collection.
Sensor Characteristics:
 Indian Earth observation satellites are equipped with a range of sensors, including
optical imagers, SAR instruments, and specialized payloads for specific applications.
 Optical sensors typically capture imagery in the visible, near-infrared, and short-wave
infrared spectral bands, enabling the detection and characterization of surface features
such as vegetation, water bodies, and urban areas.
 SAR sensors operate in the microwave portion of the electromagnetic spectrum and
provide all-weather, day-and-night imaging capabilities with high spatial resolution.
SAR imagery is particularly useful for applications requiring penetration through
clouds and vegetation cover, such as flood monitoring, crop mapping, and forest
inventory.


UNIT V - DATA PRODUCTS AND INTERPRETATION

PHOTOGRAPHIC AND DIGITAL PRODUCTS


Introduction
 Remote sensing involves the collection and interpretation of
data about the Earth's surface without direct contact.
 Both photographic and digital technologies play vital roles in
capturing and analyzing remote sensing data.
 Traditional methods relied on photographic film, while modern
approaches use digital sensors. The remote sensing data
products are available to the users in the form of
1. photographic products such as paper prints, film negatives,
diapositives in black and white, and false colour composites (FCC)
on a variety of scales
2. digital form as computer compatible tapes (CCTs) after necessary
corrections.
 Broadly, satellite data products can be classified into different
types based on satellite and sensor, level of preprocessing and
the media.
 Data products acquired for the specific period can be generated
if the data pertaining to the period of interest is available in
archives.
 Depending upon the corrections applied and on the level of
processing, data products can be classified as: raw data, partially
corrected products, standard products, geocoded products, and
precision products.
 The raw data is radiometrically and geometrically uncorrected
data with ancillary information (stereo products for
photogrammetric studies). Standard products are radiometrically
and geometrically corrected for systematic errors. Geocoded
products are systematically and geometrically corrected
products. The systematic corrections are based on the standard


survey of India toposheet and rotation of pixels to align to true
north, and resampled to a standard square pixel.

Figure Types of product media (NRSA, 1999)


 Precision products are radiometric and geometric corrections
refined with the use of ground control points to achieve greater
locational accuracy.
 Data products can be broadly classified into two types
depending upon the output media, as photographic and digital.
 The Figure shows the types of products based on media.
Photographic products can either be in black and white, or
colour. Further they could be either film or paper products, and
in films it is possible to have either positive film or negative
film.
 The sizes of photographic products can vary depending on the
enlargement needed, and this is specified as 1X, 2X, 4X and so
on. The size of film recorders is generally 240 mm and this is
the basic master output from which further products are
generated. When we say colour photographic products, it
generally means false colour composites (Plate 3).


 FCCs are generated by combining the data contained in 3
different spectral bands into one image by assigning blue, green
and red colours to the data in three spectral bands respectively
during the exposure of a colour negative (a code sketch of this
band-to-colour stacking is given after this list).
 The choice of band combinations can be determined depending
upon the application on hand.
 Different types of photographic products supplied by National
Remote Sensing Agency (NRSA) data centre, Govt., of India
(NDC) are: Standard B/W and FCC, films. Standard products
are available in colour, and black and white in the form of 240
mm films, either as negatives or positives. Figure shows the
various photographic products of different sizes and different
media of printing. Paper prints, both B/W and FCC, are
supplied in various scales.
 They are 1X (contact prints), 2X (two times enlarged), 4X
(four times enlarged) and 5X (five times enlarged). Depending
upon the enlargement, the scale of the product varies (IRS
handbook, 1998).
 The photographic products contain certain details annotated on
the margins. These are useful for identifying the scene, sensor,
date of pass, processing level, band combination, and so on.
 Basically, the visual interpretation of the remote sensing data is
based on False Colour Composites (FCCs). Even after applying
digital techniques, the results are visually interpreted.
 Scientists, analysts and other users may interpret the same scene
for different purposes. In fact it is one of the rare sources of
information which can generate multiple themes, such as water
resources, soil, land use, and urban sprawl.
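
As a companion to the FCC description above, a minimal sketch of the band-to-colour stacking, assuming the common NIR-to-red, red-to-green, green-to-blue convention; the band arrays are random stand-ins for real data.

    import numpy as np

    rng = np.random.default_rng(1)
    green, red, nir = (rng.integers(0, 256, (100, 100), dtype=np.uint8)
                       for _ in range(3))

    # The standard FCC convention renders healthy vegetation in shades of red.
    fcc = np.dstack([nir, red, green])   # (rows, cols, 3) in display R, G, B order
    print(fcc.shape, fcc.dtype)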
Figure Photographic and digital products


TYPES, LEVELS AND OPEN SOURCE SATELLITE DATA PRODUCTS
Types of Satellite Data:
1. Optical Imagery
2. Radar Imagery
3. Hyperspectral Imagery
4. Thermal Imagery
5. LiDAR Data
Optical Imagery:
Captured by satellites equipped with optical sensors, these images are
similar to what the human eye perceives. They provide information in
various spectral bands, including visible, near-infrared, and thermal
infrared, allowing for the analysis of land cover, vegetation health,
urban development, and more.


Figure optical imagery


Radar Imagery:
Synthetic Aperture Radar (SAR) satellites emit microwave signals
and measure the return signal to create images. SAR data is useful for
all-weather and day-night imaging, terrain mapping, monitoring
changes in Earth's surface, and detecting objects such as ships and oil
spills.


Figure radar image


Hyperspectral Imagery:
These images capture information in hundreds of narrow spectral
bands, providing detailed spectral signatures for materials on the
Earth's surface. Hyperspectral data is valuable for tasks like mineral
identification, environmental monitoring, and precision agriculture.


Figure hyperspectral imagery


Thermal Imagery:
Sensors aboard satellites measure thermal infrared radiation emitted
by the Earth's surface. Thermal imagery is used for detecting heat
anomalies, monitoring volcanic activity, assessing urban heat islands,
and studying climate change impacts.


Figure thermal imagery

LiDAR Data:


LiDAR sensors emit laser pulses and measure the time it takes for the
pulses to return, providing highly accurate elevation data. LiDAR is
essential for creating Digital Elevation Models (DEMs), assessing
terrain characteristics, and mapping landforms and vegetation
structure.

Figure lidar imagery


Levels of Satellite Data:
1. Level-0
2. Level-1
3. Level-2
4. Level-3
5. Level-4

Figure levels of satellite data


Level-0:
Raw data as received from the satellite without any processing.
Level-1:
Data processed to correct for sensor artifacts, geometric distortions,
and radiometric calibrations, making it usable for further analysis.
Level-2:
Further processed data with atmospheric correction applied to remove
atmospheric effects, enhancing the accuracy of quantitative analysis.
Level-3:
Data that is georeferenced and often aggregated over time or space to
create global or regional datasets suitable for thematic mapping and
trend analysis.
Level-4:
Derived products generated by combining satellite data with other
datasets or models to produce value-added products such as
vegetation indices, land cover maps, and climate variables.
Open Source Satellite Data Products:
1. Sentinel Data
2. Landsat Data
3. MODIS Data
4. ESA Earth Observation Data
5. NASA Earth Observing System Data
Sentinel Data:
The European Space Agency's Sentinel satellites offer free and open
access to a wealth of optical, radar, and thermal data through the
Copernicus Open Access Hub. Sentinel data is widely used for
environmental monitoring, disaster management, and scientific
research.


Figure Sentinel Data


Landsat Data:
The Landsat program, jointly managed by NASA and the USGS,
provides the longest continuous record of Earth observations from
space. Landsat data, including Landsat 8 and Landsat 9, is freely
available and widely used for monitoring land cover change,
assessing ecosystem health, and managing natural resources.


Figure Landsat Data

MODIS Data:
The Moderate Resolution Imaging Spectroradiometer (MODIS)
aboard NASA's Terra and Aqua satellites provides global coverage
with moderate spatial resolution and daily revisits. MODIS data is
used for monitoring vegetation dynamics, fire activity, sea surface
temperature, and atmospheric conditions.

Figure MODIS Data


ESA Earth Observation Data:


In addition to Sentinel data, the European Space Agency (ESA) offers
access to other Earth observation missions such as the Envisat, ERS,
and CryoSat satellites, providing a diverse range of data products for
scientific and operational applications.

Figure ESA Earth Observation Data


NASA Earth Observing System Data:
NASA's Earth Observing System (EOS) satellites, including Terra,
Aqua, and Aura, provide a wealth of data on Earth's atmosphere,
oceans, land surfaces, and biosphere. These data are freely available
through NASA's Earthdata Search and Distributed Active Archive
Centers (DAACs).


Figure NASA Earth Observing System Data


SELECTION AND PROCUREMENT OF DATA
1. Define Objectives and Requirements
2. Research Available Data Sources
3. Assess Data Quality and Suitability
4. Select Appropriate Data Products
5. Acquire Data
6. Ensure Data Compatibility and Preprocessing
7. Document and Validate Data
Define Objectives and Requirements:
 Clearly define the objectives of your remote sensing project or
analysis. Determine what information you need to achieve your
goals.
 Identify the spatial and temporal resolutions required for your
study area and time frame.
 Consider the spectral bands or data types necessary to address
your research questions or applications.
Research Available Data Sources:


 Explore the various sources of remote sensing data, including
satellite missions, aerial surveys, government agencies, research
institutions, and commercial providers.
 Familiarize yourself with the characteristics, capabilities, and
limitations of different sensors and platforms.
 Investigate open-source data repositories and archives that offer
free or low-cost access to satellite imagery and other remote
sensing datasets.
Assess Data Quality and Suitability:
 Evaluate the quality, accuracy, and reliability of available
datasets. Consider factors such as radiometric and geometric
calibration, sensor resolution, and spectral characteristics.
 Assess whether the spatial, spectral, and temporal resolutions
meet your requirements for the intended application.
 Verify the data's currency and relevance to your study area and
research objectives.
Select Appropriate Data Products:
 Choose the remote sensing data products that best match your
project's needs and specifications.
 Select datasets that provide the required spatial coverage,
spectral information, and temporal frequency for your analysis.
 Consider complementary datasets or multi-sensor/multi-
temporal approaches to enhance the robustness and accuracy of
your results.
Acquire Data:
 Once you've identified the desired datasets, proceed to acquire
the data through appropriate channels.
 Utilize online portals, data archives, and distribution platforms
provided by satellite agencies, government organizations, and
data providers.
 Depending on the availability and licensing terms, download or
request access to the required datasets.


Ensure Data Compatibility and Preprocessing:


 Verify that the acquired data are compatible with your analysis
software and workflow.
 Perform necessary preprocessing steps, such as geometric
correction, radiometric calibration, and atmospheric correction,
to enhance the usability and accuracy of the data.
 Address any potential issues or artifacts that may affect the
interpretation or analysis of the remote sensing data.

Document and Validate Data:


 Document the metadata and provenance of the acquired datasets,
including sensor specifications, acquisition parameters, and
processing history.
 Validate the accuracy and reliability of the data through ground
truth measurements, validation studies, or comparison with
reference datasets.
 Document any uncertainties or limitations associated with the
remote sensing data to ensure transparency and rigor in your
analysis.
VISUAL INTERPRETATION- BASIC ELEMENTS AND
INTERPRETATION KEYS
Basic elements of image interpretation
A systematic study of aerial photographs and satellite imageries
usually involves several characteristics of the features shown on an
image. The following characteristics (elements) are called
fundamental picture elements. These elements aid the visual
interpretation of aerial photos and/or satellite imagery.


Figure elements of image interpretation

(i) Tone
 Ground objects of different colour reflect the incident radiation
differently depending upon the incident wave length, physical
and chemical constituents of the objects.
 The imagery as recorded in remote sensing is in different shades
or tones. For example, ploughed and cultivated lands record
differently from fallow fields. Tone is expressed qualitatively as
light, medium and dark.
 In SLAR imagery, for example, the shadows cast by non-return
of the microwaves appear darker than those parts where greater
reflection takes place. These parts appear of lighter tone.
 Similarly in thermal imagery objects at higher temperature are
recorded of lighter tone compared to objects at lower
temperature, which appear of medium to darker tone. Similarly
top soil appears as of dark tone compared to soil containing
quartz sand.
 The coniferous trees appear in lighter tone compared to broad
leave tree clumps.
 Tone, therefore, refers to the colour or reflective brightness.
Tone along with texture and shadow (as described below) help
in Interpretation and hence is a very important key.


 Differences in moisture content of the soil or rock result in
differences in tone. In a black and white photograph, dark tones
indicate greater moisture content, while grey or white tones
reflect dry soil.
 The aerial photos with good contrast bring out tonal differences
and hence help in better interpretation. Tonal contrast can be
enhanced by use of high contrast film, high contrast paper or by
specialized image processing techniques such as 'Dodging' or
'Digital Enhancement'.
 Sometimes Infrared film can give better contrast but it can also
reduce resolution and loss of detail in shadows.
(ii) Texture
 Texture is an expression of roughness or smoothness as
exhibited by the imagery. It is the rate of change of tonal values.
 Mathematically it is given as dD/dx where D is the Density and
'x' the distance measured from one arbitrary starting point, and
can be measured numerically by the use of microdensitometer.
 Changes of density 'D' from point 'A' of the imagery to point 'B',
as measured by the micro-densitometer, divided by the distance
gives the texture value numerically. Texture is dependent upon:
(a) photographic tone
(b) shape,
(c) size,
(d) pattern and scale of the imagery.
 Any slight variation of these can change the texture. Texture can
qualitatively be expressed as coarse, medium and fine. The
texture is a combination of several image characteristics such as
tone, shadow, size, shape and pattern etc., and is produced by a
mixture of features too small to be seen individually because the
texture by definition is the frequency of tonal changes.
 As an example, leaves of a tree are too small to be seen on an
aerial photo collectively along with shadow they give what is
called texture, which in turn helps to differentiate between
shrubs and trees. Texture sometimes can be a very important
factor in determining slope stability.
 In the case of humid ground, blocked water or bad
drainage results in a characteristic texture.


 Even springs and seepage of water from the base of clay give a
kind of 'turbulent' texture.
(iii) Association
 The relation of a particular feature to its surroundings is an
important key to interpretation.
 Sometimes a single feature by itself may not be distinctive
enough to permit its identification.
 For example, sinkholes appear as dark spots on imagery
where the surface or immediate subsurface soil consists of
limestone. Thus the appearance of sinkholes is always associated
with surface limestone formations.
 An example is that of kettle holes which appear as depressions
on photos due to terminal moraine and glacial terrain.
 Another example is that of dark-toned features associated
with the flood plain of a river, which can be interpreted as infilled
oxbow lakes.
(iv) Shape
Some ground features have typical shapes due to their structure or
topography. For example, airfields and football stadiums can easily be
interpreted because of their definite ground shapes and geometry,
whereas volcanic covers, sand, river terraces, cliffs and gullies can be
identified because of their characteristic shapes controlled by geology
and topography.
(v) Size
 The size of an image also helps for its identification whether it is
relative or absolute.
 Sometimes the measurements of height (as by using parallax
bar) also gives clues to the nature of the object.
 For example, measurement of height of different clumps of trees
gives an idea of the different species, similarly the measurement
of dip and strike of rock formation help in identifying
sedimentary formation.
 Similarly the measurements of width of roads help in
discriminating roads of different categories, i.e., national, state,
local, etc. Size, of course, is dependent upon the scale of imagery.
(vi) Shadows


 Shadows cast by objects are sometimes important clues to their
identification and interpretation.
 For example, shadow of a suspension bridge can easily be
discriminated from that of cantilever bridge.
 Similarly circular shadows are indicative of coniferous trees.
Tall buildings and chimneys, and towers etc., can easily be
identified by their characteristic shadows. Shadows on the other
hand can sometimes render interpretation difficult, e.g., dark slope
shadows covering important detail.
(vii) Site factor or Topographic Location
 Relative elevation or specific location of objects can be helpful
to identify certain features.
 For example, sudden appearance or disappearance of vegetation
is a good clue to the underlying soil type.
(viii) Pattern
 Pattern is the orderly spatial arrangement of geological
topographic or vegetation features. This spatial arrangement
may be two-dimensional (plan view) or 3-dimensional (space).
 Geological patterns may be linear or curved. Linear patterns are
formed of a very large number of continuous or discontinuous
short ticks which, when viewed by eye, appear to be continuous
lines.
 Examples of linear geological pattern are faults, fractures, joints,
dykes, bedding planes, anticlines etc.,
 Examples of curved features are plunging anticlines and folds.
Lineaments or lineations may be short, medium or long running
for several hundred kilometers. These are very important
expressions of the lithologic characters of the underlying rocks,
the attitude of the rock bodies, the spacing of bedding planes and
other structural weaknesses, and the control extended by them
over the surface features. Vegetation patterns may be of the
'Block' type or 'Alignment' type.
 The 'Alignment' type may be further subdivided into the Linear,
Parallel and curved type.


 Alignments are due to narrow rock bands or faults. Since faults
retain moisture, vegetation is aligned along the fault lines. An
example of a topographic pattern is the typical drainage
pattern (controlled and uncontrolled types). The uncontrolled
types are those which are purely governed by topography, i.e.,
the slopes, whereas the controlled types are those which are
governed by the underlying geological formations.

Key Elements of Visual Image Interpretation


 Keys that provide useful reference or refresher materials and
valuable training aids for novice interpreters are called image
interpretation keys.
 These image interpretation keys are very much useful for the
interpretation of complex imageries or photographs.
 These keys provide a method of organising the information in a
consistent manner and provide guidance about the correct
identification of features or conditions on the images.
 Ideally, a key consists of two basic parts:
(i) a collection of annotated or captioned images (stereopairs)
illustrative of the features or conditions to be identified, and
(ii) a graphic or word description that sets forth in some
systematic fashion the image recognition characteristics of
those features or conditions. There are two types of keys:
the selective key and the elimination key.
Selective Key
 The selective key is also called a reference key; it contains
numerous example images with supporting text.


 The interpreter selects the one example image that most nearly
resembles the feature or condition found on the image under
study.
Elimination Key
 An elimination key is arranged so that interpretation proceeds
step by step from general to specific, and leads to the
elimination of all features or conditions except the one being
identified.
 Elimination keys are also called dichotomous keys where the
interpreter makes a series of choices between two alternatives
and progressively eliminates all but one possible answer.

DIGITAL INTERPRETATION
Digital interpretation in remote sensing refers to the process of
analyzing and extracting meaningful information from digital imagery
acquired by satellite, aerial, or other remote sensing platforms. Unlike
traditional visual interpretation, which relies on human analysts to
interpret features in photographs or maps, digital interpretation
involves the use of computer-based techniques to automate or assist in
the analysis of remote sensing data. The steps carried out are:
1. Image Processing
2. Feature Extraction
3. Change Detection
4. Quantitative Analysis
5. Integration with GIS and Modeling
6. Validation and Accuracy Assessment


Figure data interpretation


Image Processing:
 Digital interpretation begins with image processing, which
involves a series of computational techniques to enhance,
correct, and manipulate remote sensing imagery.
 Preprocessing steps may include radiometric and geometric
correction, atmospheric correction, noise reduction, and image
fusion to improve the quality and usability of the data.
Feature Extraction:
 Digital interpretation techniques are used to automatically or
semi-automatically extract features of interest from remote
sensing imagery.
 Feature extraction algorithms identify and delineate objects or
land cover classes based on their spectral, spatial, and textural
characteristics.
 Common feature extraction methods include classification,
segmentation, object-based image analysis (OBIA), and
machine learning algorithms such as supervised and
unsupervised classification.
Change Detection:
 Digital interpretation facilitates the detection and analysis of
temporal changes in remote sensing data.


 Change detection algorithms compare multiple images acquired


at different times to identify areas of change, such as urban
expansion, deforestation, land cover conversion, or natural
disasters.
 Techniques such as image differencing, image ratioing, and
time-series analysis are used to quantify and characterize the
magnitude and extent of changes over time.
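
A minimal sketch of the image-differencing technique named above, assuming two co-registered single-band images and an arbitrary change threshold; the data are synthetic.

    import numpy as np

    date1 = np.random.default_rng(0).uniform(0, 255, (50, 50))
    date2 = date1.copy()
    date2[10:20, 10:20] += 80            # simulate a changed patch

    diff = date2 - date1                 # per-pixel difference image
    changed = np.abs(diff) > 50          # boolean change mask
    print("changed pixels:", changed.sum())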
Quantitative Analysis:
 Digital interpretation enables quantitative analysis of remote
sensing data, allowing for the measurement and extraction of
numerical information from imagery.
 Quantitative analysis may include calculating vegetation
indices, estimating land surface temperature, determining object
heights or volumes, and deriving biophysical parameters such as
biomass or soil moisture.
 These quantitative measurements provide valuable insights into
environmental processes, ecosystem dynamics, and land surface
characteristics.
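
As one concrete quantitative product, a minimal sketch of the vegetation-index calculation mentioned above (NDVI = (NIR - Red) / (NIR + Red)); the reflectance arrays are toy values.

    import numpy as np

    red = np.array([[0.10, 0.30], [0.05, 0.20]])   # red-band reflectance
    nir = np.array([[0.50, 0.35], [0.40, 0.60]])   # near-infrared reflectance

    ndvi = (nir - red) / (nir + red + 1e-9)        # epsilon avoids divide-by-zero
    print(ndvi)                                     # values near +1 = dense vegetation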
Integration with GIS and Modeling:
 Digital interpretation outputs are often integrated with
geographic information systems (GIS) and spatial analysis tools
to perform further analysis, visualization, and modeling.
 GIS allows for the spatial representation, manipulation, and
overlay of remote sensing data with other geospatial datasets,
enabling comprehensive spatial analysis and decision-making.
 Digital interpretation results can be used as input for
environmental modeling, land use planning, resource
management, and disaster risk assessment.

Validation and Accuracy Assessment:


 Digital interpretation results are validated and assessed for
accuracy through ground truth measurements, reference data, or
validation studies.
 Accuracy assessment techniques compare digital interpretation
outputs with independently collected data to evaluate the
reliability and precision of the analysis.
 Validation ensures the quality and credibility of the digital
interpretation results for informed decision-making and
scientific research.
CONCEPTS OF IMAGE RECTIFICATION
Introduction
 As seen in the earlier chapters, remote sensing data can be
analysed using visual image interpretation techniques if the data
are in the hardcopy or pictorial form. It is used extensively to
locate specific features and conditions, which are then geocoded
for inclusion in GIS.
 Visual image interpretation techniques have certain
disadvantages and may require extensive training and are labour
intensive. In this technique, the spectral characteristics are not
always fully evaluated because of the limited ability of the eye
to discern tonal values and analyse the spectral changes.
 If the data are in digital mode, the remote sensing data can be
analysed using digital image processing techniques and such a
database can be used in raster GIS. In applications where
spectral patterns are more informative, it is preferable to analyse
digital data rather than pictorial data.
 In today's world of advanced technology where most remote
sensing data are recorded in digital format, virtually all image
interpretation and analysis involves some element of digital
processing.
 Digital image processing may involve numerous procedures
including formatting and correcting of the data, digital
enhancement to facilitate better visual interpretation, or even
automated classification of targets and features entirely by
computer.


 In order to process remote sensing imagery digitally, the data


must be recorded and available in a digital form suitable for
storage on a computer tape or disk. Obviously, the other
requirement for digital image processing is a computer system,
sometimes referred to as an image analysis system, with the
appropriate hardware and software to process the data.
 Several commercially available software systems have been
developed specifically for remote sensing image processing and
analysis.
 For discussion purposes, most of the common image processing
functions available in image analysis systems can be categorized
into the following four categories:
1. Preprocessing
2. Image Enhancement
3. Image Transformation
4. Image Classification and Analysis

PREPROCESSING
 Preprocessing functions involve those operations that are
normally required prior to the main data analysis and extraction
of information, and are generally grouped as radiometric or
geometric corrections.
 Radiometric corrections include correcting the data for sensor
irregularities and unwanted sensor or atmospheric noise, and
converting the data so they accurately represent the reflected or
emitted radiation measured by the sensor.
 Geometric corrections include correcting for geometric
distortions due to sensor-Earth geometry variations, and
conversion of the data to real world coordinates (e.g. latitude
and longitude) on the Earth's surface.
 The objective of the second group of image processing functions
grouped under the term of image enhancement, is solely to
improve the appearance of the imagery to assist in visual
interpretation and analysis.
 Examples of enhancement functions include contrast stretching
to increase the tonal distinction between various features in a


scene, and spatial filtering to enhance (or suppress) specific


spatial patterns in an image.
 Image transformations are operations similar in concept to those
for image enhancement. However, unlike image enhancement
operations which are normally applied only to a single channel
of data at a time, image transformations usually involve
combined processing of data from multiple spectral bands.
Arithmetic operations (i.e. subtraction, addition, multiplication,
division) are performed to combine and transform the original
bands into "new" images which better display or highlight
certain features in the scene.
 We will look at some of these operations including various
methods of spectral or band ratioing, and a procedure called
principal components analysis which is used to more efficiently
represent the information in multispectral imagery.
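
A minimal sketch of the principal components idea referred to above, assuming the bands can be stacked into a pixels-by-bands matrix; the four bands here are random stand-ins for a real multispectral scene.

    import numpy as np

    rng = np.random.default_rng(2)
    bands = rng.uniform(0, 1, (4, 100, 100))          # 4 toy spectral bands
    X = bands.reshape(4, -1).T                        # one row per pixel

    Xc = X - X.mean(axis=0)                           # centre each band
    eigvals, eigvecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
    order = np.argsort(eigvals)[::-1]                 # largest variance first
    pcs = Xc @ eigvecs[:, order]                      # principal component images
    pc1 = pcs[:, 0].reshape(100, 100)
    print(pc1.shape, "PC1 variance share:", eigvals[order][0] / eigvals.sum())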


Figure image classification

Image classification


 Image classification is a procedure to automatically categorize


all pixels in an image of a terrain into land cover classes.
 Normally, multispectral data are used to perform the
classification, and the spectral pattern present within the data for
each pixel is used as the numerical basis for categorization.
 This concept is dealt under the broad subject, namely, Pattern
Recognition.
 Spectral pattern recognition refers to the family of classification
procedures that utilises this pixel-by-pixel spectral information
as the basis for automated land cover classification.
 Spatial pattern recognition involves the categorization of image
pixels on the basis of the spatial relationship with pixels
surrounding them. Image classification techniques are grouped
into two types, namely
1. Supervised
2. Unsupervised
 The classification process may also include features, such as,
land surface elevation and the soil type that are not derived from
the image.
 A pattern is thus a set of measurements on the chosen features
for the individual to be classified. The classification process
may therefore be considered a form of pattern recognition, that
is, the identification of the pattern associated with each pixel
position in an image in terms of the characteristics of the objects
or on the earth's surface.

Supervised Classification
 A supervised classification algorithm requires a training sample
for each class, that is, a collection of data points known to have
come from the class of interest. The classification is thus based
on how "close" a point to be classified is to each training
sample.
 We shall not attempt to define the word "close" other than to say
that both geometric and statistical distance measures are used in
practical pattern recognition algorithms.

Downloaded from EnggTree.com


EnggTree.com

 The training samples are representative of the known classes of


interest to the analyst.
 Classification methods that rely on the use of training patterns
are called supervised classification methods.
 The three basic steps involved in a typical supervised
classification procedure are as follows:
(i) Training stage:
 The analyst identifies representative training areas and develops
numerical descriptions of the spectral signatures of each land
cover type of interest in the scene.
(ii) The classification stage:
 Each pixel in the image data set is categorized into the land
cover class it most closely resembles. If the pixel is
insufficiently similar to any training data set it is usually labeled
'Unknown'.
(iii) The output stage:
 The results may be used in a number of different ways. Three
typical forms of output products are thematic maps, tables and
digital data files which become input data for GIS.
 The output of image classification becomes input for GIS for
spatial analysis of the terrain. Figure depicts the flow of
operations to be performed during image classification of
remotely sensed data of an area which ultimately leads to create
database as an input for GIS.


Figure Basic steps in supervised classification


 There are a number of powerful supervised classifiers based on
statistics, which are commonly used for various
applications.
 A few of them are the minimum distance to means method,
average distance method, parallelepiped method, maximum
likelihood method, modified maximum likelihood method,
Bayesian method, decision tree classification, and discriminant
functions (a sketch of the minimum distance classifier is given
after this list).
 The principles and working algorithms of all these supervised
classifiers are available in almost all standard books on remote
sensing and so details are not provided here.
 Since all the supervised classification methods use training data
samples, it is more appropriate to consider some of the
fundamental characteristics of training data.
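
As referenced in the list above, a minimal sketch of the minimum distance to means classifier, assuming two-band data; the class mean vectors and the test pixel are invented.

    import numpy as np

    class_means = {
        "water":      np.array([20.0, 15.0]),
        "vegetation": np.array([60.0, 120.0]),
        "urban":      np.array([110.0, 90.0]),
    }

    def classify(pixel: np.ndarray) -> str:
        # Assign the pixel to the class whose training mean is nearest
        # in feature space (Euclidean distance).
        return min(class_means, key=lambda c: np.linalg.norm(pixel - class_means[c]))

    print(classify(np.array([55.0, 110.0])))   # -> 'vegetation'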
Training Dataset
 A training dataset is a set of measurements (points from an
image) whose category membership is known by the analyst.
 This set must be selected based on additional information
derived from maps, field surveys, aerial photographs, and
analyst's knowledge of usual spectral signatures of different
cover classes.
 Selecting a good set of training points is one of the most critical
aspects of the classification procedure.
 These guidelines are as following:
(i) Select a sufficient number of points for each class. If each
measurement vector has N features, then select at least N+1 points per
class; the practical minimum is 10*N per class. If the class
shows a lot of variability (the scatter plot showing considerable
spreading or scatter among training points), select a larger number
of points, subject to practical limits of time, effort and expense.
The more the training points, the better; the "extra points" can be
used to evaluate the accuracy of the classifier.

Figure Image Classification
The more the points, the more accurate the classification will be.
(ii)Select training data sets which are representative of the classes of
interest that show both typical average feature values and a typical
degree of variability. For each class, select several training areas on
the image, instead of just one. Each training area should contain a
moderately large number of pixels. Pick training areas from
seemingly homogeneous-appearing regions. Pick training areas
that are widely and spatially dispersed, across the full image. For each
class, select the training areas which are uniformly distributed across
the image and with high density.
(iii) Check that selected areas have unimodal distributions
(histograms). A bimodal histogram suggests that pixels from two
different classes may be included in the training sample.
(iv) Select training sets (physically) using a computer-based
classification system:
Poorest method: Using coordinates of training points or training
regions directly.


Better method: using joystick, trackball, light pen, directly on the


image.
For example, EASI/PACE: the program should show the histograms,
means and standard deviations for each region selected, and for each
class in total.
The Program should allow to iterate; do classification using one set of
training points, then come back and modify training sets and class
definitions without starting all over again. There should be options to
combine classes from previous classification.
(v) The program should allow one to designate half of the points as
training points, and the other half to test the accuracy of the trained
classifier. Before it is used, the training set should be evaluated by
examining scatterplots and/or histograms for each class. It should
show unimodal distributions, hopefully approximating normal
distributions. If not unimodal, one may want to select new training
sets. After the discriminant functions and the classification rule are
derived, accuracy must be tested.
 Two acceptable techniques which are commonly used are:
(a) Designate a randomly selected half of the training points as test
points, before developing the classifier. Use the other half for training.
Then classify the half of the data not used for training.
Develop contingency table (confusion matrix) to indicate probability
of error in each class. This procedure is actually a measure of the
consistency of the classifier.
(b) Randomly select a set of pixel regions from the image of an
unknown class. Classify them using the discriminant function and
rules developed from the training set. Then verify the correctness of
the classification (again with a confusion matrix) by checking the
identity of these regions using external information sources like maps
and aerial photos.
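
A minimal sketch of the contingency table (confusion matrix) evaluation described in (a) and (b); the reference and predicted labels are invented test data.

    import numpy as np

    classes = ["water", "vegetation", "urban"]
    reference = ["water", "water", "vegetation", "urban", "vegetation", "urban"]
    predicted = ["water", "vegetation", "vegetation", "urban", "vegetation", "water"]

    idx = {c: i for i, c in enumerate(classes)}
    cm = np.zeros((3, 3), dtype=int)
    for r, p in zip(reference, predicted):
        cm[idx[r], idx[p]] += 1          # rows: reference, columns: predicted

    print(cm)
    print("overall accuracy:", np.trace(cm) / cm.sum())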
(vi) Separability of classes: So far, we have looked at an ideal situation
where there is no overlap between different classes. In reality the
classes are likely to overlap. It can be seen that the less the overlap
between classes, the lower the chance of misclassifying a given pixel.
Classes that have little overlap are said to be highly separable.

Unsupervised Classification


 Unsupervised classification algorithms do not compare points
to be classified with training data.
 Rather, unsupervised algorithms examine a large number of
unknown data vectors and divide them into classes based on
properties inherent to the data themselves.
 The classes that result stem from differences observed in the
data. In particular, use is made of the notion that data vectors
within a class should be in some sense mutually close together
in the measurement space, whereas data vectors in different
classes should be comparatively well separated.
 If the components of the data vectors represent the responses in
different spectral bands, the resulting classes might be referred
to as spectral classes, as opposed to information classes, which
represent the ground cover types of interest to the analyst.
 The two types of classes described above, information classes
and spectral classes, may not exactly correspond to each other.
For instance, two information classes, corn and soya beans, may
look alike spectrally. We would say that the two classes are not
separable spectrally.
 At certain times of the growing season corn and soya beans are
not spectrally distinct while at other times they are. On the other
hand a single information class may be composed of two
spectral classes.
 Differences in planting dates or seed variety might result in the
information class "corn" comprising spectral classes that reflect
the differences between tasseled and untasseled corn.
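
The text does not name a particular unsupervised algorithm, but spectral classes are commonly formed by clustering; below is a minimal k-means-style sketch on synthetic two-band pixels.

    import numpy as np

    rng = np.random.default_rng(3)
    pixels = rng.uniform(0, 255, (500, 2))          # 500 pixels, 2 bands
    k = 3
    centers = pixels[rng.choice(len(pixels), k, replace=False)]

    for _ in range(10):                              # a few fixed iterations
        dists = np.linalg.norm(pixels[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)                # nearest-centre assignment
        centers = np.array([pixels[labels == j].mean(axis=0)
                            if np.any(labels == j) else centers[j]
                            for j in range(k)])

    print(np.bincount(labels))                       # pixels per spectral class

The resulting cluster labels are spectral classes; relating them to information classes (water, corn, urban, and so on) remains the analyst's job.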


Image enhancement
 Low sensitivity of the detectors, weak signal of the objects
present on the earth surface, similar reflectance of different
objects and environmental conditions at the time of recording
are the major causes of low contrast of the image.
 Another problem that complicates photographic display of
digital image is that the human eye is poor at discriminating the
slight radiometric or spectral differences that may characterize
the features. The main aim of digital enhancement is to amplify
these slight differences for better clarity of the image scene.
 This means digital enhancement increases the separability
(contrast) between the interested classes or features.
 The digital image enhancement may be defined as some
mathematical operations that are to be applied to digital remote
sensing input data to improve the visual appearance of an image
for better interpretability or subsequent digital analysis. Since
the image quality is a subjective measure varying from person to


person, there is no simple rule which may produce a single best
result.

Figure image enhancement


 Normally, two or more operations on the input image may
suffice to fulfill the desire of the analyst, although the enhanced
product may have a fraction of the total information stored in the
original image.
 As in many other areas of knowledge, the distinction between
one type of analysis and another is a matter of personal taste and
need of the interpreter. In remote sensing literature, many digital
enhancement algorithms are available. They are contrast
stretching enhancement, ratioing, linear combinations,
principal component analysis, and spatial filtering.
 Broadly, the enhancement techniques are categorized as point
operations and local operations.
 Point operations modify the values of each pixel in an image
data set independently, whereas local operations modify the
values of each pixel in the context of the pixel values
surrounding it.
 Point operations include contrast enhancement and band
combinations, whereas spatial filtering is an example of a local
operation. In this section, contrast enhancement, linear contrast
stretch, histogram equalization, logarithmic contrast
enhancement, and exponential contrast enhancement are
considered.
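
The distinction between point and local operations can be shown in a few lines of NumPy. This is only a sketch; the array size and the particular operations chosen are illustrative assumptions.

import numpy as np

img = np.random.randint(0, 256, (512, 512)).astype(np.float64)

# Point operation: each output pixel depends only on the same input pixel,
# e.g. a simple linear rescaling of every grey level.
point_out = 2.0 * img + 10.0

# Local operation: each output pixel depends on its neighbourhood,
# e.g. a 3 x 3 moving average built from shifted copies of the image
# (edge pixels wrap around here, purely for brevity).
local_out = sum(np.roll(np.roll(img, i, axis=0), j, axis=1)
                for i in (-1, 0, 1) for j in (-1, 0, 1)) / 9.0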

Contrast Enhancement
 The sensors mounted on board aircraft and satellites have to
be capable of detecting upwelling radiance levels ranging from
very low (e.g., over oceans) to very high (e.g., over snow or ice).


Figure: Contrast enhancement

 For any particular area that is being imaged, it is unlikely that the
full dynamic range of the sensor will be used, and the
corresponding image is dull and lacking in contrast or overly
bright. In terms of the RGB model, the pixel values are clustered
in a narrow range of grey levels.
 If this narrow range of grey levels could be altered so as to fit
the full range of grey levels, then the contrast between the dark
and light areas of the image would be improved while
maintaining the relative distribution of the grey levels. In
practice, this is achieved by manipulating the look-up table values.

 The enhancement operations are normally applied to image data
after the appropriate restoration procedures have been
performed. The most commonly applied digital enhancement
techniques will be considered now.
 The sensitivity of remote sensing detectors is designed to
record a wide range of terrain brightness, from black asphalt and
basaltic rocks to white sea ice, under a wide range of lighting
conditions. In general, few scenes span this full brightness
range, so to produce an image with an optimum contrast ratio it
is necessary to utilize the entire dynamic range of the display.
Digital contrast enhancement is thus of prime importance.
 The objective of contrast stretching is to expand the narrow
dynamic range of gray values (digital numbers) typically present
in an input image.
 A variety of contrast stretching algorithms are available; they are
broadly categorized as linear and non-linear contrast stretching.
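
A minimal sketch of a linear contrast stretch, implemented through a look-up table as described above, is given below. The percentile cut-offs and the function name are assumptions chosen for illustration; production software typically offers several stretch options.

import numpy as np

def linear_stretch(band, low_pct=2, high_pct=98):
    """Map a narrow range of input grey levels onto the full 0-255 display range."""
    lo, hi = np.percentile(band, (low_pct, high_pct))
    dn = np.arange(256, dtype=np.float64)
    # Build a 256-entry look-up table: input DN -> stretched display DN.
    lut = np.clip((dn - lo) / (hi - lo) * 255.0, 0, 255).astype(np.uint8)
    return lut[band]                      # apply the LUT by array indexing

# Illustrative use on a dull, low-contrast 8-bit band.
band = np.random.randint(60, 110, (400, 400), dtype=np.uint8)
stretched = linear_stretch(band)

Clipping a small percentage of the histogram tails (here 2% at each end) usually gives a better visual result than a strict minimum-maximum stretch, at the cost of saturating a few extreme pixels.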
Spatial Filtering Techniques
 Some of the most commonly used filtering techniques are given
below.
a. Low Pass Filters
b. Median Filter
c. High Pass Filters
d. Filtering for Edge Enhancement

 A characteristic of remotely sensed images is a parameter called
spatial frequency, defined as the number of changes in
brightness values per unit distance for any particular part of an
image.
 If there are few changes in brightness value over a given area, it
is termed a low-frequency area. If the brightness values
change dramatically over very short distances, it is called a
high-frequency area.
 Algorithms which perform image enhancement are called filters
because they suppress certain frequencies and pass (emphasize)
others. Filters that pass high frequencies while emphasizing fine
detail and edges are called high-frequency (high-pass) filters, and
filters that pass low frequencies are called low-frequency
(low-pass) filters.
 Filtering is performed by using convolution windows. These
windows are called a mask, template, filter, or kernel. In the
process of filtering, the window is moved over the input image
from the extreme top left-hand corner of the scene.
 A discrete mathematical function transforms the original
input image digital numbers into new digital values.
 The window first moves along a line; as soon as the line is
complete, it restarts on the next line until the entire image is
covered.
 The mask window may be rectangular (1 x 3 or 1 x 5 pixels) or
square (3 x 3, 5 x 5, or 7 x 7 pixels) in size. Each pixel of
the window is given a weightage. For low-pass filters all
the weights in the window are positive, while for high-pass
filters the surrounding values may be negative or zero, but the
central pixel is positive with a higher weightage value.
 In the case of a high-pass filter, the algebraic sum of all the
weights in the window will be zero.
 Many types of mask windows of different sizes can be designed
by changing the size and varying the weightages within the window.
 The simplest mathematical function performed in a filtering
operation is neighbourhood averaging.
 Another commonly used discrete function calculates the sum of
the products of the mask elements and the corresponding input
image pixel digital numbers, and assigns the result to the central
pixel of the moving window.
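
The moving-window computation described above can be sketched as follows. The kernels shown are a standard 3 x 3 averaging (low-pass) mask with all-positive weights and a high-pass mask whose weights sum to zero; the plain nested loop is for clarity rather than speed, and the one-pixel border is simply left at zero.

import numpy as np

def convolve3x3(img, kernel):
    """Slide a 3 x 3 mask over the image; the sum of products replaces the centre pixel."""
    out = np.zeros_like(img, dtype=np.float64)
    rows, cols = img.shape
    for r in range(1, rows - 1):              # skip the one-pixel border
        for c in range(1, cols - 1):
            window = img[r - 1:r + 2, c - 1:c + 2]
            out[r, c] = np.sum(window * kernel)
    return out

low_pass = np.full((3, 3), 1.0 / 9.0)         # all-positive weights (neighbourhood averaging)
high_pass = np.array([[-1., -1., -1.],
                      [-1.,  8., -1.],
                      [-1., -1., -1.]])       # weights sum to zero

img = np.random.randint(0, 256, (128, 128)).astype(np.float64)
smoothed = convolve3x3(img, low_pass)         # suppresses high spatial frequencies
edges = convolve3x3(img, high_pass)           # emphasises edges and fine detail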
