Fundamentals of Remote Sensing Course
B.Sc., Geography
Semester - V
FUNDAMENTALS OF REMOTE
SENSING
Department of Geography
School of Sciences
Tamil Nadu Open University
Chennai – 600 015.
BACHELOR OF SCIENCE IN GEOGRAPHY
BGEOS-52
SEMESTER-V
Department of Geography,
School of Sciences
Tamil Nadu Open University
577, Anna Salai, Saidapet, Chennai - 600 015
www.tnou.ac.in
August 2023
Name of Programme: B.Sc. Geography
Printed by:
BGEOS 52 - FUNDAMENTALS OF REMOTE SENSING
Syllabus Details
1. https://gistbok.ucgis.org/bok-topics
2. https://gisgeography.com/remote-sensing-earth-observation-guide/
3. https://www.iirs.gov.in/
4. https://www.heavy.ai/technical-glossary/remote-sensing
BGEO21_52 - FUNDAMENTALS OF REMOTE SENSING
Contents
Unit 1
Definition and Types: Aerial, Satellite and
Radar
Structure
Overview
Learning Objectives
Overview
An aerial photograph, in broad terms, is any photograph taken from the air.
Normally, air photos are taken vertically from an aircraft using a highly
accurate camera. There are several things you can look for to determine
what makes one photograph different from another of the same area
including type of film, scale, and overlap. Other important concepts used in
aerial photography are stereoscopic coverage, fiducial marks, focal length,
roll and frame numbers, and flight lines and index maps. Earth observation
satellites gather information for reconnaissance, mapping, monitoring the
weather, ocean, forest, etc. Space telescopes take advantage of outer space's near-perfect vacuum to observe objects across the entire electromagnetic spectrum. Because satellites can see a large portion of the
Earth at once, communications satellites can relay information to remote
places.
The signal delay from satellites and their orbit's predictability are used in
satellite navigation systems, such as GPS. Space probes are satellites
designed for robotic space exploration outside of Earth, and space stations
are in essence crewed satellites. Radar was developed secretly for military
use by several countries in the period before and during World War II. A key
development was the cavity magnetron in the United Kingdom, which
allowed the creation of relatively small systems with sub-meter resolution.
The term RADAR was coined in 1940 by the United States Navy as an
acronym for "radio detection and ranging”. The modern uses of radar are
highly diverse, including air and terrestrial traffic control, radar astronomy,
air-defense systems, anti-missile systems, marine radars to locate
landmarks and other ships, aircraft anti-collision systems, ocean
surveillance systems, outer space surveillance and rendezvous systems,
meteorological precipitation monitoring, altimetry and flight control systems,
guided missile target locating systems, self-driving cars, and ground-
penetrating radar for geological observations.
Learning Objectives
After studying this unit, you would be able to understand the following.
• Aerial photographs and their types
• Satellites and their types
• Radar and its types
1.1.1 Types of Aerial Photographs based on the position of the
Camera Axis
Based on the position of the camera axis, aerial photographs are classified
into the following types.
a. Vertical Photographs
While taking aerial photographs, two distinct axes are formed from the
camera lens centre, one towards the ground plane and the other towards
the photo plane. The perpendicular dropped from the camera lens centre to
the ground plane is termed as the vertical axis, whereas the plumb line
drawn from the lens centre to the photo plane is known as the
photographic/optical axis. When the photo plane is kept parallel to the
ground plane, the two axes also coincide with each other. The photograph
so obtained is known as a vertical photograph. However, it is normally very
difficult to achieve perfect parallelism between the two planes because the
aircraft flies over the curved surface of the earth. The photographic axis,
therefore, deviates from the vertical axis. If such a deviation is within the range of plus or minus 3°, near-vertical aerial photographs are obtained. Any photograph with an unintentional deviation of more than 3° in the optical axis from the vertical axis is known as a tilted photograph.
b. Low Oblique
An aerial photograph taken with an intentional deviation of 15° to 30° in the
camera axis from the vertical axis is referred to as the low oblique
photograph. This kind of photograph is often used in reconnaissance
surveys.
c. High Oblique
High oblique photographs are obtained when the camera axis is intentionally inclined about 60° from the vertical axis. Such photography is useful in reconnaissance surveys.
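To make the tilt thresholds above concrete, here is a minimal sketch in Python (the function name and the treatment of angles falling between the named classes are illustrative assumptions, not part of any standard):

```python
def classify_aerial_photo(tilt_deg: float) -> str:
    """Classify an aerial photograph by the deviation of its optical
    axis from the vertical axis, using the thresholds in this unit."""
    if tilt_deg <= 3:
        return "vertical (or near vertical)"
    elif 15 <= tilt_deg <= 30:
        return "low oblique"
    elif tilt_deg >= 60:
        return "high oblique"
    else:
        # other deviations are simply "tilted" photographs
        return "tilted"

print(classify_aerial_photo(2))   # vertical (or near vertical)
print(classify_aerial_photo(20))  # low oblique
print(classify_aerial_photo(60))  # high oblique
```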
1.2 Satellites
A satellite or artificial satellite is an object intentionally placed into orbit in
outer space. Except for passive satellites, most satellites have an electricity
generation system for equipment on board, such as solar panels or
radioisotope thermoelectric generators (RTGs). Most satellites also have a
method of communication to ground stations, called transponders. Many
satellites use a standardized bus to save cost and work, the most popular
of which is the small CubeSat. Similar satellites can work together as a group,
forming constellations. Because of the high launch cost to space, satellites
are designed to be as lightweight and robust as possible.
Satellites are placed from the surface to orbit by launch vehicles, high
enough to avoid orbital decay by the atmosphere. Satellites can then
change or maintain the orbit by propulsion, usually by chemical or ion
thrusters. As of 2018, about 90% of satellites orbiting Earth were in low Earth orbit or geostationary orbit; a geostationary satellite appears to stay still in the sky. Some imaging satellites choose a Sun-synchronous orbit because they can scan the entire globe under similar lighting. As the number of
satellites and space debris around Earth increases, the collision threat is
becoming more severe.
1.2.1 Types of Satellites
Four different types of satellite orbits have been identified depending on the
shape and diameter of each orbit:
• GEO (Geostationary Earth Orbit) at 36,000 km above the earth's surface.
• LEO (Low Earth Orbit) at 500–1,500 km above the earth's surface.
MEO (Medium Earth Orbit) refers to satellites between the LEO and GEO orbits; despite this huge range of distances, only about 5% of satellites operate in this space. This area is used largely by navigation satellites, like the European Galileo system. The fourth orbit type is the least common, with just 2% of satellites in this category. The vast majority of these are military or government missions, with just three that are for commercial use.
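To make these orbit classes more concrete, the following sketch estimates the orbital period of a circular orbit at a given altitude using Kepler's third law (a standard physics result; the constants and function name are illustrative):

```python
import math

MU_EARTH = 3.986004418e14  # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6_371_000        # mean Earth radius, m

def orbital_period_hours(altitude_km: float) -> float:
    """Period of a circular orbit at the given altitude (Kepler's third law)."""
    a = R_EARTH + altitude_km * 1000  # semi-major axis, m
    return 2 * math.pi * math.sqrt(a**3 / MU_EARTH) / 3600

print(f"LEO (800 km):   {orbital_period_hours(800):.2f} h")    # ~1.7 h
print(f"GEO (35786 km): {orbital_period_hours(35786):.2f} h")  # ~23.9 h, one sidereal day
```

The GEO result of roughly 23.9 hours matches the Earth's rotation period, which is why a satellite at that altitude appears to stay still in the sky.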
1.3 Radar
Radar is a detection system that uses radio waves to determine the distance
(ranging), angle, and radial velocity of objects relative to the site. It can be
used to detect aircraft, ships, spacecraft, guided missiles, motor vehicles,
weather formations, and terrain. A radar system consists of a transmitter
producing electromagnetic waves in the radio or microwaves domain, a
transmitting antenna, a receiving antenna (often the same antenna is used
for transmitting and receiving) and a receiver and processor to determine
properties of the objects. Radio waves (pulsed or continuous) from the
transmitter reflect off the objects and return to the receiver, giving
information about the objects' locations and speeds.
1.3.1 Types of Radar
a) Bistatic Radar
This type of radar system includes a Tx-transmitter & an Rx- receiver that is
divided through a distance that is equivalent to the distance of the estimated
object. The transmitter & the receiver are situated at a similar position is
called a monastic radar whereas the very long-range surface to air & air to
air military hardware uses the bistatic radar.
b) Doppler Radar
It is a special type of radar that uses the Doppler Effect to generate velocity data about a target at a particular distance. This is achieved by transmitting electromagnetic signals in the direction of an object and analysing how the motion of the object has affected the returned signal's frequency. This change gives very precise measurements of the radial component of an object's velocity in relation to the radar. The applications of these radars span different industries, including meteorology, aviation, and healthcare.
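For a monostatic radar, the radial velocity described above follows from v = f_d · λ / 2, where f_d is the measured Doppler shift and λ the transmitted wavelength. A minimal sketch with purely illustrative values:

```python
C = 299_792_458  # speed of light, m/s

def radial_velocity(doppler_shift_hz: float, tx_freq_hz: float) -> float:
    """Radial velocity of a target from the Doppler shift of the echo.
    The factor of 2 accounts for the two-way path (out and back)."""
    wavelength = C / tx_freq_hz
    return doppler_shift_hz * wavelength / 2

# Example: a 5.6 GHz weather radar measuring a 1 kHz Doppler shift
print(radial_velocity(1000, 5.6e9))  # ~26.8 m/s toward the radar
```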
c) Monopulse Radar
This kind of radar system compares the signal received from a single radar pulse against itself, contrasting the signal as observed in several directions or polarizations. The most frequent type of monopulse radar is the conical scanning radar, which evaluates the return from two directions to measure the position of the object directly. It is significant to note that most radars developed since the 1960s are monopulse radars.
d) Passive Radar
This kind of radar is mainly designed to detect and track targets by processing reflections from sources of illumination in the surroundings. These sources comprise communication signals as well as commercial broadcasts. This radar can be placed in the same category as bistatic radar.
e) Instrumentation Radar
These radars are designed for testing aircraft, missiles, rockets, etc. They provide information including space, position, and time, both in real time and in post-processing analysis.
f) Weather Radars
These are used to detect weather and wind direction by using radio signals with circular or horizontal polarization. The frequency choice of a weather radar mainly depends on a performance compromise between attenuation and precipitation reflection caused by atmospheric water vapour. Some types of radar are designed to employ Doppler shifts to calculate wind speed, as well as dual polarization to recognize the types of rainfall.
g) Mapping Radar
These radars are mainly used to examine a large geographical area for remote sensing and geography applications. As a result of using synthetic aperture radar, they are restricted to relatively stationary targets. Some radar systems are used to detect humans behind walls, whose reflections are more distinctive than those of the surrounding construction materials.
h) Navigational Radars
Generally, these are similar to search radars, but they use short wavelengths that are capable of reflecting from the ground and from stones. They are commonly used on commercial ships as well as long-distance airplanes. There are different navigational radars, like marine radars, which are commonly placed on ships for collision avoidance and navigation purposes.
i) Pulsed Radar
Pulsed RADAR sends high power and high-frequency pulses towards the
target object. It then waits for the echo signal from the object before another
pulse is sent. The range and resolution of the RADAR depend on the pulse
repetition frequency. It uses the Doppler shift method: the principle of RADAR detecting moving objects using the Doppler shift works on the fact that echo signals from stationary objects are in the same phase and hence get cancelled, while echo signals from moving objects will have some changes in phase.
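The dependence of range on pulse repetition frequency (PRF) follows from the requirement that an echo must return before the next pulse is transmitted, giving a maximum unambiguous range of R_max = c / (2 · PRF). A minimal sketch:

```python
C = 299_792_458  # speed of light, m/s

def max_unambiguous_range_km(prf_hz: float) -> float:
    """Maximum unambiguous range of a pulsed radar: the echo from this
    distance returns just as the next pulse is about to be transmitted."""
    return C / (2 * prf_hz) / 1000

print(max_unambiguous_range_km(1000))  # ~150 km at a PRF of 1 kHz
```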
Let Us Sum Up
An aerial photograph, in broad terms, is any photograph taken from the air.
Normally, air photos are taken vertically from an aircraft using a highly
accurate camera. There are several things you can look for to determine
what makes one photograph different from another of the same area
including type of film, scale, and overlap. A satellite or artificial satellite is an
object intentionally placed into orbit in outer space. Except for passive
satellites, most satellites have an electricity generation system for
equipment on board, such as solar panels or radioisotope thermoelectric
generators (RTGs).
Glossary
Vertical Photographs: While taking aerial photographs, two distinct axes are
formed from the camera lens centre, one towards the ground plane and the
other towards the photo plane.
Low Oblique: An aerial photograph taken with an intentional deviation of 15°
to 30° in the camera axis from the vertical axis is referred to as the low
oblique photograph. This kind of photograph is often used in
reconnaissance surveys.
High Oblique: High oblique photographs are obtained when the camera axis is intentionally inclined about 60° from the vertical axis. Such photography is useful in reconnaissance surveys.
1. Types of Aerial Photographs
• Vertical
• Low Oblique
• High Oblique
2. Types of Satellites
• Geostationary Earth Orbit
• Low Earth Orbit
3. Types of Radar
• Bistatic Radar
• Doppler Radar
• Monopulse Radar
• Passive Radar
• Instrumentation Radar
• Weather Radars
• Mapping Radar
• Navigational Radars
• Pulsed Radar
Suggested Readings
1. Campbell, J., 1989. Introduction to Remote Sensing, Guilford, New York.
2. Lillesand, T.M. and Kiefer, R.W., 1994. Remote Sensing and Image Interpretation, John Wiley & Sons, New York.
3. Sarkar, A. (2015): Practical Geography: A Systematic Approach. Orient Black Swan Private Ltd., New Delhi.
4. Anji Reddy, M. (2008): Textbook of Remote Sensing and Geographic Information System, B.S. Publication, Hyderabad.
5. Campbell, J. B., (2007): Introduction to Remote Sensing, Guilford Press.
Unit 2
History, Organization and Development of
Space Programmes
Structure
Overview
Learning Objectives
Overview
Remote sensing began in the 1840s as balloonists took pictures of the
ground using the newly invented photo camera. Perhaps the most novel
platform at the end of the last century is the famed pigeon fleet that operated
as a novelty in Europe. The concept of photography was developed by
Greek mathematician Aristotle by using a pinhole camera in the 5th and 4th
centuries. Photography was originally invented in the early 1800s. The
world’s first chemical photograph was taken in 1826 by Joseph Niepce of
France using a process known as heliography. In this unit you can learn the
historical development of remote sensing and the development of Indian
remote Sensing.
Learning Objectives
After studying this unit, you will be able to know the following.
• Development of Remote Sensing
• Milestones of Remote Sensing
• History of India’s Space Programme
During World War II, photo interpreters had to review millions of stereoscopic aerial images to detect hidden Nazi rocket
bases. During the Cold War, the use of aerial reconnaissance increased
with U-2 aircraft flying at ultra-high altitudes (70,000 ft) to capture imagery.
Aerial photography grew quickly following the war and was soon employed
for a variety of purposes. These new photographs provided people with a
realistic vantage of the world few had seen before. Aerial photography was
a much faster and cheaper way to produce maps compared to traditional
ground surveys.
In the United States, aerial photography was used for farm programs
beginning in the Dust Bowl Era of the 1930s with the passing of the
Agricultural Adjustment Act. The agency, then known as the Agricultural Adjustment Administration (AAA), began its aerial photography program in 1937, and by 1941 the AAA had flown and acquired aerial photographs of more than 90% of the agricultural land in the US. The Agriculture
Department's aerial photography program became a tool for conservation
and land planning as well as an instrument of fair and accurate
measurement. The agricultural agencies have since been consolidated and
are now known as Farm Service Agency (FSA). The FSA is still responsible
for aerial imagery programs in the US. Aerial photography remained the
primary tool for depicting the Earth's surface until the early 1960s.
Landsat 1
The development of satellite-based remote sensing began with the "space
race" in the 1950s and 1960s. In 1957 the Soviet Union launched Sputnik
1, the world's first artificial satellite. The United States followed in 1958 with
the successful launch of Explorer 1. The next decades brought about rapid
developments in satellites and imaging technology. The first successful
meteorological satellite (TIROS-1) was launched in 1960. In 1972 Landsat
1, the first earth resource satellite, was launched by the US. The original
goal of the Landsat program was to collect data from the Earth through
remote sensing techniques. Landsat 1 was originally named Earth
Resources Technology Satellite 1 and was later renamed Landsat 1. The
Landsat program has continued for 45 years with Landsat 8 launched in
2013.
Since the launch of Sputnik in 1957, thousands of satellites have been
launched. There is a myriad of commercial and government satellites in
operation today, many of which are used for remote sensing applications.
There are currently over 3,600 satellites orbiting the Earth, but only
approximately 1400 are operational. Of these satellites, well over 100 are
Earth-observing satellites that carry a variety of different sensors to
measure and capture data about the Earth. These satellites are often
launched by governments to monitor Earth's resources, but private
commercial companies are becoming increasingly active in launching earth-
observing satellites as well.
Milestones of Remote Sensing (selected entries):
5. 1919 – Hoffman: first to sense thermal IR from an aircraft.
1995 – Launch of ERS-2, Radarsat-1 and IRS-1C.
36. 2012 – RISAT-1.
2016 – ScatSat-1.
The requirements of land and water resources management, ocean and atmospheric studies and cartographic applications resulted in the realisation of theme-based satellite series, namely, (i) land/water resources applications (RESOURCESAT series and RISAT series); (ii) ocean/atmospheric studies (OCEANSAT series, INSAT-VHRR, INSAT-3D, Megha-Tropiques and SARAL); and (iii) large-scale mapping applications (CARTOSAT series).
IRS-1A development was a major milestone in the IRS programme. On this
occasion of 30 years of IRS-1A and the fruitful journey of the Indian remote
sensing programme, it is important to look back at the achievements of the
Indian Space Programme, particularly in remote sensing applications,
wherein India has become a role model for the rest to follow. Significant
progress continued in building and launching the state-of-the-art Indian
Remote Sensing Satellite as well as in operational utilisation of the data in
various applications to the nation.
Today, the array of Indian Earth Observation (EO) satellites, with imaging capabilities in the visible, infrared, thermal and microwave regions of the electromagnetic spectrum, including hyperspectral sensors, has helped the country in realising major operational applications. The imaging sensors have been providing spatial resolutions ranging from 1 km to better than 1 m, repeat observation (temporal imaging) from 22 days to every 15 minutes, and radiometric resolution ranging from 7-bit to 12-bit, which has significantly helped several applications at the national level. In the coming years, the Indian EO satellites are heading towards further strengthened and improved technologies, taking cognizance of the learnings and achievements made over the years.
2.3.1 Communication Satellites
The Indian National Satellite (INSAT) system is one of the largest domestic
communication satellite systems in the Asia-Pacific region with nine
operational communication satellites placed in Geo-stationary orbit.
Established in 1983 with the commissioning of INSAT-1B, it initiated a major
revolution in India’s communications sector and sustained the same later.
GSAT-17 joined the INSAT constellation, which consists of 15 operational satellites, namely INSAT-3A, 3C, 4A, 4B, 4CR and GSAT-6, 7, 8, 9, 10, 12, 14, 15, 16 and 18. The INSAT system, with more than 200
transponders in the C, Extended C and Ku-bands provides services to
telecommunications, television broadcasting, satellite news gathering,
societal applications, weather forecasting, disaster warning and Search and
Rescue operations.
2.3.2 Earth Observation Satellites
Starting with IRS-1A in 1988, ISRO has launched many operational remote
sensing satellites. Today, India has one of the largest constellations of
remote sensing satellites in operation. Currently, thirteen operational satellites are in Sun-synchronous orbit – RESOURCESAT-1, 2 and 2A; CARTOSAT-1, 2, 2A and 2B; RISAT-1 and 2; OCEANSAT-2; Megha-Tropiques; SARAL; and SCATSAT-1 – and four are in geostationary orbit – INSAT-3D, Kalpana-1, INSAT-3A and INSAT-3DR. A variety of instruments
have been flown on board these satellites to provide necessary data in a
diversified spatial, spectral and temporal resolutions to cater to different
user requirements in the country and for global usage. The data from these
satellites are used for several applications covering agriculture, water
resources, urban planning, rural development, mineral prospecting,
environment, forestry, ocean resources and disaster management.
2.3.3 Satellite Navigation
Satellite Navigation service is an emerging satellite-based system with
commercial and strategic applications. ISRO is committed to providing
satellite-based navigation services to meet the emerging demands of civil aviation and the user requirements for positioning, navigation and timing based on an independent satellite navigation system.
To meet the Civil Aviation requirements, ISRO is working jointly with the
Airport Authority of India (AAI) in establishing the GPS Aided Geo
Augmented Navigation (GAGAN) system. To meet the user requirements of
the positioning, navigation and timing services based on the indigenous
system, ISRO is establishing a regional satellite navigation system called
Indian Regional Navigation Satellite System (IRNSS).
GPS Aided GEO Augmented Navigation (GAGAN)
2.3.4 Standard Positioning Service (SPS) and Restricted Service (RS)
ISRO has built a total of nine satellites in the IRNSS series, of which eight are currently in orbit. Three of these satellites are in geostationary orbit (GEO), while the remaining are in geosynchronous orbits (GSO) that maintain an inclination of 29° to the equatorial plane. The IRNSS constellation was named "NavIC" (Navigation with Indian Constellation) by the Honourable Prime Minister, Mr. Narendra Modi, and dedicated to the nation upon the successful launch of the IRNSS-1G satellite. The eight operational satellites in the IRNSS series, namely IRNSS-1A, 1B, 1C, 1D, 1E, 1F, 1G and 1I, were launched on Jul 02, 2013; Apr 04, 2014; Oct 16, 2014; Mar 28, 2015; Jan 20, 2016; Mar 10, 2016; Apr 28, 2016; and Apr 12, 2018, respectively. The PSLV-C39 / IRNSS-1H launch was unsuccessful; the satellite could not reach orbit.
Space Science & Exploration
The Indian space programme encompasses research in areas like astronomy, astrophysics, planetary and earth sciences, atmospheric sciences, and theoretical physics. Balloons, sounding rockets, space platforms and ground-based facilities support these research efforts. A series of sounding rockets are available for atmospheric experiments. Several scientific instruments have been flown on satellites, especially to detect celestial X-ray and gamma-ray bursts.
AstroSat
AstroSat is the first dedicated Indian astronomy mission aimed at studying
celestial sources in X-ray, optical and UV spectral bands simultaneously.
The payloads cover the energy bands of ultraviolet, limited optical and the X-ray regime (0.3 keV to 100 keV). One of the unique features of the AstroSat
mission is that it enables the simultaneous multi-wavelength observations
of various astronomical objects with a single satellite. AstroSat with a lift-off
mass of 1515 kg was launched on September 28, 2015, into a 650 km orbit
inclined at an angle of 6° to the equator by PSLV-C30 from Satish Dhawan
Space Centre, Sriharikota. The minimum useful life of the AstroSat mission
is expected to be 5 years.
2.3.5 Mars Orbiter Mission
Mars Orbiter Mission is ISRO’s first interplanetary mission to planet Mars
with an orbiter craft designed to orbit Mars in an elliptical orbit of 372 km by
80,000 km. Mars Orbiter mission can be termed as a challenging
technological mission and a science mission considering the critical mission
operations and stringent requirements on propulsion, communications, and
other bus systems of the spacecraft. The primary driving technological
objective of the mission is to design and realize a spacecraft with a capability
to perform Earth Bound Manoeuvre (EBM), Martian Transfer Trajectory
(MTT) and Mars Orbit Insertion (MOI) phases and the related deep space
mission planning and communication management at nearly 400 million
km. Autonomous fault detection and recovery also becomes vital for the
mission.
Chandrayaan-1
Chandrayaan-1, India's first mission to Moon, was launched successfully on
October 22, 2008, from SDSC SHAR, Sriharikota. The spacecraft was
orbiting around the Moon at a height of 100 km from the lunar surface for
chemical, mineralogical, and photo-geologic mapping of the Moon. The
spacecraft carried 11 scientific instruments built in India, USA, UK,
Germany, Sweden, and Bulgaria.
Chandrayaan-2
Chandrayaan-2 is an advanced version of the previous Chandrayaan-1 mission to the Moon. Chandrayaan-2 is configured as a two-module system comprising an Orbiter Craft module (OC) and a Lander Craft module (LC) carrying the Rover developed by ISRO.
Small Satellites
The small satellite project is envisaged to provide a platform for stand-alone
payloads for earth imaging and science missions within a quick turnaround
time. For making the versatile platform for different kinds of payloads, two
kinds of buses have been configured and developed.
SARAL, realised on the small satellite bus, is a co-operative mission between ISRO and CNES with payloads from CNES and
spacecraft bus from ISRO.
University / Academic Institute Satellites
ISRO has influenced educational institutions through its activities, such as building satellites for communication, remote sensing, and astronomy. The launch of Chandrayaan-1 increased the interest of universities and institutions in making experimental student satellites. Capable universities and institutions can venture into on-orbit space technology with guidance and support from ISRO in the following ways.
Development of Payload (by Universities/Institutions)
Every satellite carries a payload that performs the intended function to
achieve the mission goal and the main bus that supports the payload
function. The development of payloads may comprise detectors,
electronics, and associated algorithms, which can be an experimental
piggyback payload on the ISRO’s ongoing (Small or operational) satellite
projects. Design and development of detectors, payload electronics, and
associated algorithms/experiments that enhance the application of space
services to mankind is a continuing R&D activity in several educational
institutions all over the world. Educational institutions can propose the
payloads developed by them to be flown on ISRO’s small satellites.
Satellite Design & Fabrication by Universities/Institutions
Under this option, Universities have to design, fabricate, test the satellite
Bus & Payload and deliver the integrated spacecraft for launch. Technical
guidance in designing, fabrication and testing will be provided by ISRO.
Some critical materials for the space mission also will be provided by ISRO.
The designs and test results will be reviewed by the ISRO team. Under this
option, more than one University/Institution may participate. One among
them will be the focal point for the ISRO. After launch, the collected data will
be archived and disseminated by the university/Institution(s).
2.3.6 Satellite Launchers
Launchers or Launch Vehicles are used to carry spacecraft to space. India
has two operational launchers: Polar Satellite Launch Vehicle (PSLV) and
Geosynchronous Satellite Launch Vehicle (GSLV). GSLV with indigenous
Cryogenic Upper Stage has enabled the launching of up to 2 tonne class of
communication satellites. The next variant of GSLV is GSLV Mk III, with an
indigenous high thrust cryogenic engine and stage, having the capability of
launching 4 tonne class of communication satellites.
Satellite Launchers
Let Us Sum Up
Landsat: A series of unmanned NASA satellites that orbit the Earth and collect multispectral imagery in various visible and infrared bands. Landsat was an open Earth resources programme that continues today through more advanced Landsat satellites and other satellite resource monitoring programmes.
Sentinel Satellite is a family of satellites developed by the European Space
Agency (ESA) under the Copernicus Programme. The Copernicus Programme is the European Earth observation programme, initiated in 1998. RESOURCESAT-2A was launched on December 7,
2016.
Glossary
Geosynchronous Orbit: The geostationary or geosynchronous orbit is one in which the time it takes the satellite to complete one revolution is equal to the time it takes the Earth to rotate once around its polar axis.
Sun-synchronous Orbit is the orbit where the satellite travels from the north
to the south poles as the Earth rotates below it.
Heliocentric theory argues that the sun is the central body of the solar
system and perhaps of the universe. Everything else (planets and their
satellites, asteroids, comets, etc.) revolves around it.
Suggested Readings
1. Campbell, J., 1989. Introduction to Remote Sensing, Guilford, New York.
2. Lillesand, T.M. and Kiefer, R.W., 1994. Remote Sensing and Image Interpretation, John Wiley & Sons, New York.
3. Sarkar, A. (2015): Practical Geography: A Systematic Approach. Orient Black Swan Private Ltd., New Delhi.
4. Anji Reddy, M. (2008): Textbook of Remote Sensing and Geographic Information System, B.S. Publication, Hyderabad.
5. Campbell, J. B., (2007): Introduction to Remote Sensing, Guilford Press.
6. Chauniyal, D.D., (2010): Sudur Samvedan evam Bhogolik Suchana Pranali (Hindi), Sharda Pustak Bhawan, Allahabad.
7. Jensen, J.R. (2007): Remote Sensing of the Environment: An Earth Resource Perspective, Prentice-Hall Inc., New Jersey.
8. Joseph, G. (2005): Fundamentals of Remote Sensing, Universities Press, India.
Web Sources
1. https://gisgeography.com/landsat/
2. https://landsat.gsfc.nasa.gov/
3. https://sentinel.esa.int/web/sentinel/missions/sentinel-1
BLOCK 2
Unit 3
Introduction to Remote Sensing
Structure
Overview
Learning Objectives
3.1 Introduction to Remote Sensing
Let us Sum Up
Glossary
Check Your Progress
Suggested Readings
Web Sources
Overview
The term "remote sensing" refers to gathering data from distance. Remote
sensing is an exciting field to work in. It also makes use of any or all of the
electromagnetic spectrum. The electromagnetic energy reflected or emitted
by the earth's surface is recorded. Understanding the interactions of energy
with various aspects of the earth's surface helps us in analysing the
remotely sensed image.
Learning Objectives
After studying this unit, you would be able to understand the following.
A remotely sensed image is characterised by the region it represents in the EMS. Some images represent reflected solar radiation in the visible and the near-infrared regions of the electromagnetic spectrum; others are measurements of the energy emitted by the earth's surface itself. The energy measured in the microwave region is a measure of the relative return from the earth's surface, where the energy is transmitted from the platform itself. This is known as active remote sensing, since the energy source is provided by the remote sensing platform. Systems whose measurements depend upon an external energy source, such as the sun, are referred to as passive remote sensing systems.
Remote sensing also gives information about the earth’s objects from a
distance, without any physical contact with the earth's surface. Simply, we can define remote sensing as a technique to collect and interpret information about an object, area, incident (e.g., a disaster) or change (e.g., in land use) without being in physical contact with the earth's surface. Aircraft,
satellites, and drones are the major platforms for remote sensing of the
earth and its natural resources.
3.1.1 Definitions of Remote Sensing
• Remote sensing is the science of data collection regarding an object or
phenomena without physical contact with the object.
• Remote sensing is the science and art of obtaining information about an
object, area, or phenomenon through the analysis of data acquired by a
device that is not in contact with the object, area, or phenomenon under
investigation.
• Large area coverage enables regional surveys on a variety of
themes in various wave bands.
• Repetitive coverage allows monitoring of dynamic themes like water,
agriculture etc.
• Easy data acquisition at different scales and resolutions.
Remote sensing helps in surveying, gathering, and retrieving information in difficult-to-access locations. It featured a complicated interface, but it was less reliable. It is not as well
suited to interdepartmental communication. Remote sensing is the process
of extracting information about the Earth's land and ocean surfaces from
images taken from above, using electromagnetic radiation reflected or
emitted from the Earth's surface in one or many portions of the
electromagnetic spectrum.
Geology, forestry, soil science, geography, and urban planning are just a
few of the disciplines that study physical objects. Sensor data is created
when an instrument (such as a camera or radar) records electromagnetic
radiation emitted or reflected from the ground while seeing physical objects.
Because of their unfamiliar overhead perspective, unique resolutions, and
utilisation of spectral regions outside the visible spectrum, sensor data might
appear abstract and strange to many people. As a result, successful use of sensor data relies on data analysis and interpretation to convert data into information that can be used to solve real problems like landfill siting or mineral deposit searching. These interpretations result in extracted information, which is
more realistic perspective shows how the same sensor data may be
analysed from several perspectives to provide different interpretations.
Finally, there are the applications, in which the analysed remote sensing data can be integrated with other data to solve a specific practical problem, such as land use planning, mineral prospecting, or water quality mapping. Applications are implemented in the field of GIS when digital remote sensing data is merged with other geospatial data. For example, remote sensing data may provide accurate land-use information that can be combined with soil, geologic, transportation, and other information to guide the siting of a new landfill.
Let us Sum Up
The science of collecting information about an object or phenomenon
without coming into close contact with it is known as remote sensing. It helps in surveying, gathering, and retrieving information in difficult-to-access locations. It featured a complicated interface, but it was less reliable. Remote sensing is
the process of extracting information about the Earth's land and ocean
surfaces from images taken from above, using electromagnetic radiation
reflected or emitted from the Earth's surface in one or many portions of the
electromagnetic spectrum.
Glossary
Remote Sensing: Remote sensing is the science of data collection regarding an object or phenomena without physical contact with the object.
Suggested Readings
1. Campbell J., 1989. Introduction to Remote Sensing, Guilford, New
York.
2. Lillesand, T.M. and Kiefer, R.W., 1994. Remote Sensing and Image Interpretation, John Wiley & Sons, New York.
3. Sarkar, A. (2015): Practical Geography: A Systematic Approach.
Orient Black Swan Private Ltd., New Delhi.
4. Anji Reddy, M. (2008): Textbook of Remote Sensing and Geographic
Information System, B.S. Publication, Hyderabad.
5. Campbell, J. B., (2007): Introduction to Remote Sensing, Guilford Press.
Web Sources
1. https://gisgeography.com/landsat/
2. https://landsat.gsfc.nasa.gov/
3. https://sentinel.esa.int/web/sentinel/missions/sentinel-1
Unit 4
Sources of Energy and Electromagnetic
Radiations (EMR)
Structure
Overview
Learning Objectives
4.1 Sources of Energy
Overview
Learning Objectives
After Learning this lesson, you will be able to:
• Know the Concepts of Electro-Magnetic Radiation
Fig. 4.1 Electromagnetic Radiation
While many characteristics of electromagnetic radiation are easily described by wave theory, another theory, known as particle theory, offers insight into how electromagnetic energy interacts with matter.
An electric field accelerates an atomic particle, such as an electron, forcing
it to travel, resulting in EM radiation. The motion causes oscillating electric
and magnetic fields, which propagate in bundles of light energy called a
photon at right angles to each other. Photons move at the fastest
conceivable speed in the universe, which is 186,282 miles per second
(299,792,458 meters per second) in a vacuum, commonly known as the
speed of light. Frequency, wavelength, and energy are all properties of
waves.
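The energy of a single photon is given by the Planck relation E = hν = hc/λ (a standard physics result not stated explicitly above); the sketch below computes it for green light:

```python
H = 6.62607015e-34  # Planck constant, J*s
C = 299_792_458     # speed of light, m/s

def photon_energy_joules(wavelength_m: float) -> float:
    """Energy of a single photon: E = h * c / wavelength (Planck relation)."""
    return H * C / wavelength_m

# Green light at 0.55 micrometres
print(photon_energy_joules(0.55e-6))  # ~3.6e-19 J
```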
The application of electromagnetic radiation in remote sensing allows for
data transmission. EMR is a type of energy that shows itself in the form of
visible effects when it interacts with matter. All signals collected by the
majority of remote sensing sensors originate from electromagnetic
radiation. Depending on the sensor's features, the source of this energy
changes. The remote sensing system's components are linked by this
energy.
For remote sensing, two features of electromagnetic radiation are very
significant. These are the wavelength and frequency. The wavelength
(lambda λ) is the distance between successive wave crests and measures
the length of one wave cycle. The number of cycles of a wave passing a
fixed point per unit of time is referred to as frequency (ν).
c = λν
Where,
c is the speed of light
λ is the wavelength
ν is the frequency
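For example, rearranging gives ν = c/λ; the short sketch below (illustrative only) computes the frequency of red light at 0.7 µm:

```python
C = 299_792_458  # speed of light, m/s

def frequency_hz(wavelength_m: float) -> float:
    """Frequency from wavelength using c = lambda * nu."""
    return C / wavelength_m

print(frequency_hz(0.7e-6))  # ~4.3e14 Hz for red light at 0.7 um
```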
4.2.1 Visible Range
The visible portion of the electromagnetic spectrum is small since the
spectral sensitivity of the human eye is only about 0.4 to 0.7 µm. Film and
photo detectors are used to capture these images. Blue (0.4-0.5µm), green
(0.5-0.6µm) and red (0.6-0.7µm) are the wavelengths that fall within this
visual range.
4.2.2 Non-visible Range
• Gamma-ray: The wavelength in this region is less than 0.03 nm.
The upper atmosphere absorbs all the incoming radiation from this
region.
• X-ray: This region has a wavelength range of 0.03 to 3.0 nm. The
atmosphere absorbs these rays as well.
• Radio wave: This region's wavelength ranges from 10 cm to 100 km.
This is the portion of the electromagnetic spectrum with the longest
wavelengths. Some classified radars with very long wavelengths
operate in this region.
Let us Sum Up
Electromagnetic radiation is a wave of electric and magnetic fields
propagating at the speed of light C through empty space. In this wave the
electric and magnetic fields change their magnitude and direction each
second. An electric field accelerates an atomic particle, such as an electron,
forcing it to travel, resulting in EM radiation. The motion causes oscillating
electric and magnetic fields, which propagate in bundles of light energy
called a photon at right angles to each other. Frequency, wavelength, and
energy are all properties of waves. The application of electromagnetic
radiation in remote sensing allows for data transmission. EMR is a type of
energy that shows itself in the form of visible effects when it interacts with
matter. All signals collected by most remote sensing sensors originate from
electromagnetic radiation. Depending on the sensor's features, the source
of this energy changes. The remote sensing system's components are
linked by this energy.
Glossary
Gamma-ray: The wavelength in this region is less than 0.03 nm.
X-ray: This region has a wavelength range of 0.03 to 3.0 nm. The
atmosphere absorbs these rays as well.
Ultraviolet: The wavelength extends from 0.03 to 0.4 µm in the ultraviolet
region. Ozone in the upper atmosphere entirely absorbs incoming
wavelengths less than 0.3µm. It causes fluorescence and it has applications
in geology and vegetation.
Infra-Red: There are three logical zones in this spectrum.
• Near Infra-Red (NIR)
• Reflected Infra-Red / Mid Infra-Red (MIR)
• Thermal Infra-Red (TIR)
Microwave: This region has wavelengths ranging from 1mm to 1m. These
are the regions with longer wavelengths that can penetrate clouds, fog, and
rain. One of the active forms of microwave remote sensing is radar.
Radio wave: This region's wavelength ranges from 10 cm to 100 km. This
is the portion of the electromagnetic spectrum with the longest wavelengths.
Some classified radars with very long wavelengths operate in this region.
Suggested Readings
6. Anji Reddy, M. (2008): Textbook of Remote Sensing and Geographic Information System, B.S. Publication, Hyderabad.
7. Campbell, J. B., (2007): Introduction to Remote Sensing, Guilford Press.
8. Chauniyal, D.D., (2010): Sudur Samvedan evam Bhogolik Suchana Pranali (Hindi), Sharda Pustak Bhawan, Allahabad.
Web Sources
1. https://gisgeography.com/landsat/
2. https://landsat.gsfc.nasa.gov/
3. https://sentinel.esa.int/web/sentinel/missions/sentinel-1
Unit 5
Electromagnetic Spectrum and
Atmospheric Windows
Structure
Overview
Learning Objectives
5.1 Electromagnetic Spectrum
5.1.1 Spectral Reflectance Patterns Visible Region
Overview
Electromagnetic waves are categorized based on their wavelengths in the electromagnetic spectrum. The most prevalent unit used to measure wavelength is the micrometre (1 µm = 1×10⁻⁶ m). Every natural and synthetic object on the earth's surface and near-surface reflects and emits EMR over a range of wavelengths in its own characteristic way, according to its chemical composition and physical state.
Learning Objectives
After Learning this lesson, you will be able to:
• Know the Concepts of Electro-Magnetic Spectrum
5.1 Electromagnetic Spectrum
The electromagnetic spectrum covers electromagnetic waves with
frequencies ranging from below one hertz to above 10²⁵ hertz,
corresponding to wavelengths from thousands of kilometers down to a
fraction of the size of an atomic nucleus. This frequency range is divided
into separate bands, and the electromagnetic waves within each frequency
band are called by different names; beginning at the low frequency (long
wavelength) end of the spectrum these are: radio waves, microwaves,
infrared, visible light, ultraviolet, X-rays, and gamma rays at the high-
frequency (short wavelength) end. The electromagnetic waves in each of
these bands have different characteristics, such as how they are produced,
how they interact with matter, and their practical applications. There is no
known limit for long wavelengths, while it is thought that the short
wavelength limit is in the vicinity of the Planck length. Extreme ultraviolet,
soft X-rays, hard X-rays and gamma rays are classified as ionizing radiation
because their photons have enough energy to ionize atoms, causing
chemical reactions. Exposure to ionizing radiation can be a health hazard,
causing radiation sickness, DNA damage and cancer. Radiation of visible
light and longer wavelengths is classified as nonionizing radiation because its photons have insufficient energy to cause these effects.
The electromagnetic radiation spectrum spans wavelengths from very short gamma rays (10⁻¹⁰ m) to long radio waves (10⁶ m). However, in remote sensing activities, the most useful regions of the EMR are the visible (0.4 to 0.7 μm), reflected infrared (0.7 to 3 μm), thermal infrared (3 to 5 and 8 to 14 μm) and microwave (0.3 to 300 cm) regions. Broad divisions of the
electromagnetic spectrum are summarized in the following table.
Table 5.1 Spectral Band with Wavelength
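The broad divisions summarized in Table 5.1 can also be encoded as a simple lookup; the boundaries below are the approximate values quoted in this unit, and any gap between the quoted limits is reported as unknown (a sketch, not a definitive classification):

```python
# Broad EMS divisions used in this unit (wavelengths in micrometres, approximate)
EMS_BANDS = [
    ("gamma ray",   0.0,   3e-5),  # less than 0.03 nm
    ("X-ray",       3e-5,  3e-3),  # 0.03 - 3.0 nm
    ("ultraviolet", 0.03,  0.4),   # the text leaves a gap below 0.03 um
    ("visible",     0.4,   0.7),
    ("infrared",    0.7,   1000),  # reflected + thermal IR, up to 1 mm
    ("microwave",   1000,  1e6),   # 1 mm - 1 m
    ("radio",       1e6,   1e11),  # beyond 1 m
]

def ems_band(wavelength_um: float) -> str:
    """Return the broad EMS division for a wavelength in micrometres."""
    for name, lo, hi in EMS_BANDS:
        if lo <= wavelength_um < hi:
            return name
    return "unknown"

print(ems_band(0.55))  # visible
print(ems_band(10.0))  # infrared (thermal)
```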
Fig. 5.1 Electromagnetic Spectrum
5.1.1 Spectral Reflectance Patterns Visible Region
The visible portion of the electromagnetic spectrum is small; the spectral sensitivity of the human eye is only about 0.4 to 0.7 μm. The visible ranges are blue (0.4 to 0.5 μm), green (0.5 to 0.6 μm) and red (0.6 to 0.7 μm).
5.1.2 Spectral Reflectance Patterns Non-Visible Region
Gamma ray region: In this region the wavelength is less than 0.03 nm. The incoming radiation is completely absorbed by the upper atmosphere and is not available for remote sensing.
X-ray region: The wavelength range is 0.03 to 3.0 nm. This radiation, too, is absorbed by the atmosphere.
Ultraviolet region: The ultraviolet region falls in the wavelength range from 0.03 to 0.4 μm. Incoming wavelengths less than 0.3 μm are completely absorbed by ozone in the upper atmosphere. Ultraviolet radiation causes fluorescence and is useful in some geological and vegetation applications. The photographic ultraviolet region lies here; these wavelengths are detectable with film and photodetectors.
Infra-Red region: There are three logical zones in this spectrum.
• Near Infra-Red (NIR)
• Mid Infra-Red (MIR)
• Thermal Infra-Red (TIR)
Near Infra-Red (NIR)
This wavelength region is 0.7 to 1.3 μm and interactions of this region with
matter vary with wavelength.
Mid Infra-Red (MIR)
This region's wavelength range is 1.3 to 3.0 μm. The reflected solar radiation in this region contains no information about the thermal properties of materials. The band from 0.7 to 0.9 μm is detectable with film and is called the photographic IR band; it can also be detected using electro-optical sensors.
Thermal Infra-Red (TIR)
The TIR region is grouped in the wavelength ranges 3 to 5 μm and 8 to 14 μm. These are the principal atmospheric windows in the thermal region. Images at these wavelengths can be acquired by optical-mechanical scanners and special vidicon systems, but they cannot be detected using films.
Microwave region:
The wavelength range of this region falls from 1 mm to 1 m. The microwaves are further divided into different wavelength bands, as tabulated below. These are the longer wavelength regions and can penetrate clouds, fog, and rain. Images can be acquired in either passive or active mode. Radar is one of the active forms of microwave remote sensing.
Band   Wavelength
P      30 – 100 cm
L      15 – 30 cm
S      7.5 – 15 cm
C      3.8 – 7.5 cm
X      2.4 – 3.8 cm
Ku     1.7 – 2.4 cm
K      1.1 – 1.7 cm
Ka     0.75 – 1.1 cm
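The band limits in the table can be turned into a small lookup routine; a minimal sketch (the function name is illustrative):

```python
# Radar wavelength bands from the table above (wavelengths in cm)
RADAR_BANDS = [
    ("Ka", 0.75, 1.1),
    ("K",  1.1,  1.7),
    ("Ku", 1.7,  2.4),
    ("X",  2.4,  3.8),
    ("C",  3.8,  7.5),
    ("S",  7.5,  15),
    ("L",  15,   30),
    ("P",  30,   100),
]

def radar_band(wavelength_cm: float) -> str:
    """Return the radar band designation for a wavelength in centimetres."""
    for name, lo, hi in RADAR_BANDS:
        if lo <= wavelength_cm < hi:
            return name
    return "outside tabulated bands"

print(radar_band(5.6))   # C
print(radar_band(23.5))  # L
```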
Radio Waves region:
The wavelength of this region extends from 10 cm to 100 km. This region is the longest wavelength portion of the electromagnetic spectrum.
5.1.3 Spectral Reflectance Patterns of Visible and Non-Visible
Regions
X-rays: Hot gases in the Universe also emit X-rays.
Gamma-rays: Radioactive materials (some natural and others made by man in things like nuclear power plants) can emit gamma-rays. Big particle accelerators that scientists use to help them understand what matter is made of can sometimes generate gamma-rays. But the biggest gamma-ray generator of all is the Universe, which makes gamma radiation in all kinds of ways.
Gases in the atmosphere absorb energy at certain wavelengths while allowing other wavelengths to pass through. The places where energy
passes through are called "atmospheric windows". We use these "windows"
in remote sensing to peer into the atmosphere from which we can obtain
much information concerning the weather. Most of the sun's energy comes
from visible light and the near infrared portion of the electromagnetic
spectrum. All the outgoing energy emitted by the earth is infrared.
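As a simple illustration, the principal thermal-infrared windows named in this unit (3 to 5 µm and 8 to 14 µm) can be checked in a few lines of code (a sketch; the window limits are the approximate values from the text):

```python
# Principal thermal-infrared atmospheric windows from this unit (micrometres)
TIR_WINDOWS = [(3.0, 5.0), (8.0, 14.0)]

def in_tir_window(wavelength_um: float) -> bool:
    """True if the wavelength falls inside one of the thermal IR windows."""
    return any(lo <= wavelength_um <= hi for lo, hi in TIR_WINDOWS)

print(in_tir_window(10.5))  # True  - usable for thermal remote sensing
print(in_tir_window(6.5))   # False - strongly absorbed by water vapour
```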
Let us Sum Up
The electromagnetic spectrum is the range of frequencies of
electromagnetic radiation and their respective wavelengths and photon
energies. The electromagnetic spectrum covers electromagnetic waves
with frequencies ranging from below one hertz to above 10²⁵ hertz,
corresponding to wavelengths from thousands of kilometers down to a
fraction of the size of an atomic nucleus. This frequency range is divided
into separate bands, and the electromagnetic waves within each frequency
band are called by different names; beginning at the low frequency end of
the spectrum these are: radio waves, microwaves, infrared, visible light,
ultraviolet, X-rays, and gamma rays at the high-frequency end.
Glossary
Gamma-ray: The wavelength in this region is less than 0.03 nm.
X-ray: This region has a wavelength range of 0.03 to 3.0 nm. The
atmosphere absorbs these rays as well.
Ultraviolet: The wavelength extends from 0.03 to 0.4 µm in the ultraviolet
region. Ozone in the upper atmosphere entirely absorbs incoming
wavelengths less than 0.3µm. It causes fluorescence and it has applications
in geology and vegetation.
Microwave: This region has wavelengths ranging from 1mm to 1m. These
are the regions with longer wavelengths that can penetrate clouds, fog, and
rain. One of the active forms of microwave remote sensing is radar.
Radio wave: This region's wavelength ranges from 10 cm to 100 km. This
is the portion of the electromagnetic spectrum with the longest wavelengths.
Some classified radars with very long wavelengths operate in this region.
Suggested Readings
1. Campbell, J., 1989. Introduction to Remote Sensing, Guilford, New York.
2. Lillesand, T.M. and Kiefer, R.W., 1994. Remote Sensing and Image Interpretation, John Wiley & Sons, New York.
3. Sarkar, A. (2015): Practical Geography: A Systematic Approach. Orient Black Swan Private Ltd., New Delhi.
4. Anji Reddy, M. (2008): Textbook of Remote Sensing and Geographic Information System, B.S. Publication, Hyderabad.
5. Campbell, J. B., (2007): Introduction to Remote Sensing, Guilford Press.
Unit 6
Energy Interaction with Atmosphere and
The Earth
Structure
Overview
Learning Objectives
Overview
Before radiation used for remote sensing reaches the Earth's surface it has
to travel through some distance of the Earth's atmosphere. Particles and
gases in the atmosphere can affect the incoming light and radiation. These
effects are caused by the mechanisms of scattering and absorption.
Scattering occurs when particles or large gas molecules present in the
atmosphere interact with and cause the electromagnetic radiation to be
redirected from its original path. How much scattering takes place depends
on several factors including the wavelength of the radiation, the abundance
of particles or gases, and the distance the radiation travels through the
atmosphere. The part of the radiation field that has made it through the
atmosphere without being absorbed or scattered back toward space now
reaches the Earth’s surface.
Learning Objectives
After Learning this lesson, you will be able to:
• Know the Interaction of EMR with Atmosphere
• Understand the Interaction of EMR with Earth’s Surface Features
• Know the Types of Scattering (Rayleigh, Mie and Nonselective)
6.1 Scattering in the Atmosphere
6.1.1 Rayleigh Scattering
Rayleigh scattering occurs when particles are very small compared to the
wavelength of the radiation. These could be particles such as small specks
of dust or nitrogen and oxygen molecules. Rayleigh scattering causes
shorter wavelengths of energy to be scattered much more than longer
wavelengths. Rayleigh scattering is the dominant scattering mechanism in
the upper atmosphere. The fact that the sky appears "blue" during the day
is because of this phenomenon. As sunlight passes through the
atmosphere, the shorter wavelengths (i.e., blue) of the visible spectrum are
scattered more than the other (longer) visible wavelengths. At sunrise and
sunset, the light has to travel farther through the atmosphere than at midday
and the scattering of the shorter wavelengths is more complete; this leaves
a greater proportion of the longer wavelengths to penetrate the atmosphere.
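Rayleigh scattering intensity varies approximately as λ⁻⁴ (a standard result implicit in the description above); the sketch below quantifies how much more strongly blue light is scattered than red:

```python
def rayleigh_ratio(lambda1_um: float, lambda2_um: float) -> float:
    """Relative Rayleigh scattering intensity of wavelength 1 vs wavelength 2,
    using the approximate lambda^-4 dependence."""
    return (lambda2_um / lambda1_um) ** 4

print(rayleigh_ratio(0.4, 0.7))  # ~9.4: blue is scattered ~9x more than red
```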
6.1.2 Mie Scattering
Mie scattering occurs when the particles are just about the same size as the
wavelength of the radiation. Dust, pollen, smoke, and water vapour are
common causes of Mie scattering which tends to affect longer wavelengths
than those affected by Rayleigh scattering. Mie scattering occurs mostly in
the lower portions of the atmosphere where larger particles are more
abundant and dominates when cloud conditions are overcast.
6.1.3 Nonselective Scattering
This occurs when the particles are much larger than the wavelength of the radiation. Water droplets and large dust particles can cause this type of scattering. Nonselective scattering gets its name from the fact that all wavelengths are scattered about equally.
6.2 Interaction of EMR with Earth’s Surface Features
Absorption is the other main mechanism at work when electromagnetic
radiation interacts with the atmosphere. In contrast to scattering, this
phenomenon causes molecules in the atmosphere to absorb energy at
various wavelengths. Ozone, carbon dioxide, and water vapour are the
three main atmospheric constituents which absorb radiation. Ozone serves
to absorb the harmful ultraviolet radiation from the sun. Without this
protective layer in the atmosphere our skin would burn when exposed to
sunlight. You may have heard carbon dioxide referred to as a greenhouse
gas. This is because it tends to absorb radiation strongly in the far infrared
portion of the spectrum - that area associated with thermal heating - which
serves to trap this heat inside the atmosphere. Water vapour in the
atmosphere absorbs much of the incoming long wave infrared and
shortwave microwave radiation. The presence of water vapour in the lower
atmosphere varies greatly from location to location and at different times of
the year. The air mass above a desert would have very little water vapour
to absorb energy, while the tropics would have high concentrations of water
vapour. Radiation that is not absorbed or scattered in the atmosphere can
reach and interact with the Earth's surface. There are three forms of
interaction that can take place when energy strikes or is incident upon the
surface. These are: absorption (A); transmission (T); and reflection (R).
Fig. 6.4 Interaction of EMR with Earth’s Surface Features and Atmosphere
Radiation from the sun, when incident upon the earth’s surface, is either
reflected by the surface, transmitted into the surface or absorbed and
emitted by the surface. The EMR, on interaction, experiences a number of
changes in magnitude, direction, wavelength, polarization and phase.
These changes are detected by the remote sensor and enable the
interpreter to obtain useful information about the object of interest. The
remotely sensed data contain both spatial information (size, shape and
orientation) and spectral information (tone, colour and spectral signature).
In the microwave region of the spectrum, the sensor is radar, which is an
active sensor, as it provides its own source of EMR. The EMR produced by
the radar is transmitted to the earth’s surface and the EMR reflected (back
scattered) from the surface is recorded and analyzed. The microwave
region can also be monitored with passive sensors, called microwave
radiometers, which record the radiation emitted by the terrain in the
microwave region.
6.2.1 Reflection
Of all the interactions in the reflective region, surface reflections are the
most useful and revealing in remote sensing applications. Reflection occurs
when a ray of light is redirected as it strikes a non-transparent surface. The
reflection intensity depends on the surface refractive index, absorption
coefficient and the angles of incidence and reflection.
Fig. 6.5 Different types of scattering surfaces (a) Perfect specular reflector
(b) Near perfect specular reflector (c) Lambertain (d) Quasi-Lambertian
(e) Complex
6.2.2 Transmission
Transmission of radiation occurs when radiation passes through a
substance without significant attenuation. For a given thickness, or depth of
a substance, the ability of a medium to transmit energy is measured as
transmittance.
6.2.3 Spectral Signature
Spectral reflectance is the ratio of reflected energy to incident energy as a
function of wavelength. Various materials of the earth’s surface have
different spectral reflectance characteristics. Spectral reflectance is
responsible for the colour or tone in a photographic image of an object.
Trees appear green because they reflect more of the green wavelength.
The values of the spectral reflectance of objects averaged over different,
well-defined wavelength intervals comprise the spectral signature of the
objects or features by which they can be distinguished.
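The idea of distinguishing features by their spectral signatures can be sketched as a nearest-signature classifier; the reflectance values below are illustrative placeholders, not measured data:

```python
# Illustrative mean reflectances (fractions) in green, red and near-IR bands
SIGNATURES = {
    "water":      (0.05, 0.03, 0.01),  # low everywhere, near zero in NIR
    "vegetation": (0.12, 0.08, 0.50),  # green peak, strong NIR reflection
    "bare soil":  (0.15, 0.20, 0.30),
}

def classify(pixel):
    """Assign a pixel to the class whose signature is closest (squared
    Euclidean distance over the three bands)."""
    def dist(sig):
        return sum((p - s) ** 2 for p, s in zip(pixel, sig))
    return min(SIGNATURES, key=lambda name: dist(SIGNATURES[name]))

print(classify((0.10, 0.07, 0.45)))  # vegetation
```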
Fig. 6.6 Interaction of EMR with Earth’s Surface
Let us Sum Up
The interaction between electromagnetic radiation and the Earth’s
atmosphere can be considered to have three components: refraction that
changes the direction of propagation of the radiation field due to density
differences between outer space and the atmosphere, scattering that
changes the direction of propagation of individual photons as they are
absorbed and re-emitted by gases or aerosols or other atmospheric
constituents without changing wavelength, and absorption that converts
photons into vibrations in a molecule, energy which is re-emitted as one or
more photons with longer wavelength(s). The probability of reflection rather
than absorption happening is termed the reflectance of the surface, and it
depends on the material on the surface as well as the wavelength of the
incoming radiation. Each surface material has a unique ‘signature’ that
defines what proportion of radiation is reflected for each wavelength. For
example, water reflects a small amount of blue and green wavelengths, less
of the red wavelengths, and almost nothing in the infrared wavelengths.
Vegetation, on the other hand, reflects around half of all incoming infrared
radiation, except for specific wavelengths that are effectively absorbed by
liquid water in the leaves. These spectral signatures are commonly
portrayed as graphs, with wavelengths along the x-axis and reflectance
along the y-axis.
Glossary
Rayleigh scattering: Rayleigh scattering occurs when particles are very
small compared to the wavelength of the radiation. These could be particles
such as small specks of dust or nitrogen and oxygen molecules. Rayleigh
scattering causes shorter wavelengths of energy to be scattered much more
than longer wavelengths.
Mie scattering: Mie scattering occurs when the particles are just about the
same size as the wavelength of the radiation. Dust, pollen, smoke and water
vapour are common causes of Mie scattering which tends to affect longer
wavelengths than those affected by Rayleigh scattering. Mie scattering
occurs mostly in the lower portions of the atmosphere where larger particles
are more abundant and dominates when cloud conditions are overcast.
Nonselective scattering: This occurs when the particles are much larger
than the wavelength of the radiation. Water droplets and large dust particles
can cause this type of scattering. Nonselective scattering gets its name from
the fact that all wavelengths are scattered about equally.
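The wavelength dependence in these definitions can be put into numbers: Rayleigh scattering intensity varies approximately as 1/λ^4. A one-line sketch (illustrative wavelengths, not tied to any particular atmosphere):

# Rayleigh scattering intensity varies roughly as 1 / wavelength^4.
blue, red = 0.45, 0.65  # wavelengths in micrometres
print(f"blue light is scattered about {(red / blue) ** 4:.1f}x more than red")

The factor of roughly four that this prints is why a clear sky appears blue.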
Suggested Readings
1. Campbell, J., 1989. Introduction to Remote Sensing, Guilford, New York.
2. Lillesand, T.M. and Kiefer, R.W., 1994. Remote Sensing and Image Interpretation, John Wiley & Sons, New York.
3. Sarkar, A., 2015. Practical Geography: A Systematic Approach. Orient Black Swan Private Ltd., New Delhi.
4. Anji Reddy, M., 2008. Textbook of Remote Sensing and Geographic Information System, B.S. Publication, Hyderabad.
5. Campbell, J.B., 2007. Introduction to Remote Sensing, Guilford Press.
BLOCK 3
Unit 7
Platforms, Types of Platforms, and
its Characteristics
Structure
Overview
Learning Objectives
Overview
Remote sensing platforms can be defined as the structures or vehicles on
which remote sensing instruments (sensors) are mounted. For remote
sensing applications, sensors should be mounted on suitable stable
platforms. These platforms can be ground-based, airborne, or space-borne. As platform height increases, the area viewed in a single observation increases while the level of spatial detail decreases: the higher the sensor is mounted, the more synoptic the view, but the coarser the spatial resolution. The types or
characteristics of platform depend on the type of sensor to be attached and
its application. Platforms for remote sensors may be situated on the ground,
on an aircraft or balloon (or some other platform within the Earth's
atmosphere), or on a spacecraft or satellite outside of the Earth's
atmosphere. Typical platforms are satellites and aircraft, but they can also include radio-controlled aeroplanes, and balloons and kites for low-altitude remote sensing, as well as ladder trucks or 'cherry pickers' for ground investigations.
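The trade-off between platform height, coverage and detail can be illustrated with simple pinhole-camera geometry. In the sketch below, the focal length, pixel pitch and detector count are hypothetical values chosen only to contrast an aircraft altitude with a satellite altitude:

def gsd_and_swath(altitude_m, focal_length_m, pixel_pitch_m, n_pixels):
    # Ground sample distance (GSD) and swath width from pinhole geometry.
    gsd = pixel_pitch_m * altitude_m / focal_length_m
    return gsd, gsd * n_pixels

for h in (3_000, 700_000):  # aircraft vs. satellite altitude, in metres
    gsd, swath = gsd_and_swath(h, 0.5, 10e-6, 12_000)
    print(f"altitude {h:>7,} m -> GSD {gsd:6.2f} m, swath {swath / 1000:7.1f} km")

The higher platform covers a swath hundreds of kilometres wide, but each pixel then represents a much larger patch of ground.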
Learning Objectives
After Learning this lesson, you will be able to:
• Understand the Remote Sensing Platforms
Portable masts are also available in various forms and can be used to
support cameras and sensors for testing. The main problem with these
masts is that of stabilizing the platform, particularly in windy conditions.
Permanent ground platforms like towers and cranes are used for monitoring atmospheric phenomena and for long-term monitoring of terrestrial features.
Towers can be built on site and can be tall enough to project through a forest
canopy so that a range of measurements can be taken from the forest floor,
through the canopy and from above the canopy.
Fig. 7.2 Balloon as platform
7.1.3 Aircraft Platform
Aerial platforms are primarily fixed-wing aircraft; helicopters are also occasionally used for this purpose. Generally, aircraft are used to collect very detailed images. Helicopters can be used for pinpointing locations, but they vibrate and lack stability.
b. High Altitude Aircraft
It is more stable and operates above 30,000 ft. High altitude aircraft include jet aircraft with a good rate of climb, high maximum speed, and a high operating ceiling. They acquire imagery for large areas (smaller scale). Examples are NHAP, NAPP and AVIRIS. Aircraft platforms acquire imagery under suitable weather conditions, and platform variables such as altitude and time of coverage can be controlled.
Fig. 7.5 Rocket as Platform
7.1.5 Spacecraft as Platform
Remote sensing is also conducted from the space shuttle or artificial
satellites. Artificial satellites are manmade objects, which revolve around
another object. The 1960s saw the primary platform used to carry remotely
sensed instruments shifted from airplanes to satellite. Satellites can cover
much more land space than planes and can monitor areas on a regular
basis. Beginning with the first television and infrared observation Satellite in
1960, early weather satellites returned rather poor views of cloud patterns
and almost indistinct images of the earth’s surface.
Space photography improved and was further extended with the Apollo program. Then in 1973, Skylab, the first American space workshop, was launched, and its astronauts took over 35,000 images of the earth with the Earth Resources Experiment Package on board. Later, with the Landsat and SPOT satellite programmes, space photography received a higher impetus.
7.2 Sensors
A sensor is a device that gathers energy (EMR or other), converts it into a
signal and presents it in a form suitable for obtaining information about the
target under investigation. According to Jensen (2000), remote sensors are
mechanical devices, which collect information, usually in storable form,
about objects or scenes, while being at some distance from them. Sensors
used for remote sensing can be either those operating in Optical Infrared
(OIR) region or those operating in the microwave region. Depending on
the source of energy, sensors are categorized as active or passive:
7.2.1 Active Sensors
Active sensors are those, which have their own source of EMR for
illuminating objects. Radar (Radio Detection and Ranging) and Lidar (Light
Detection and Ranging) are some examples of active sensors. A
photographic camera becomes an active sensor when used with a flash
bulb. Radar is composed of a transmitter and a receiver. The transmitter
emits a wave, which hits objects in the environment and gets reflected or
echoed back to the receiver. The main advantage is that active sensors can
obtain imagery in wavebands where natural signal levels are extremely low
and also are independent of natural illumination. The major disadvantage of active sensors is that they need high energy levels, so an adequate input of power is necessary.
7.2.2 Passive Sensors
Passive sensors do not have their own source of energy. These sensors
receive solar electromagnetic energy reflected from the surface or energy
emitted by the surface itself. Therefore, except for thermal sensors they
cannot be used at nighttime. Thus, in passive sensing, there is no control
over the source of electromagnetic radiation. Photographic cameras
(without the use of bulb), multispectral scanners, vidicon cameras etc. are
examples of passive remote sensors. The advantage with passive sensor
is that it is simple and does not require high power. The disadvantage is that passive sensors do not work during bad weather conditions. The Thematic Mapper (TM) sensor system on the Landsat satellites is a passive sensor.
temperatures of land and sea, and status of volcanic activities and forest
fires.
7.3.6 Hyper Spectral Imaging System
A hyperspectral imaging system records the radiation of the terrain in hundreds of narrow spectral bands. The spectral signature of an object can therefore be captured accurately, which helps identify objects more precisely. For example, Hyperion data are recorded in 242 spectral bands and AVIRIS data in 224 spectral bands.
7.3.7 Microwave Sensors
These types of sensors receive microwaves, which have longer
wavelengths than visible light and infrared rays. The observation is not
affected by day, night or weather. The microwave portion of the spectrum
includes wavelengths within the approximate range of 1 mm to 1 m. The longest microwaves are about 2,500,000 times longer than the shortest light waves. There are two types of observation methods using microwave sensors: a) Active sensors, which emit microwaves and observe the microwaves reflected by land-surface features; these are used to observe mountains, valleys, and ocean-surface wind, wave and ice conditions; and b) Passive sensors, which record microwaves naturally radiated from earth-surface features.
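Because the microwave region is quoted either in wavelength or in frequency, the conversion c = f × λ is often needed; the sample wavelengths below simply span the 1 mm to 1 m range mentioned above:

C = 299_792_458.0  # speed of light, m/s

for wavelength_m in (0.001, 0.03, 1.0):  # 1 mm, 3 cm, 1 m
    print(f"{wavelength_m * 100:6.1f} cm <-> {C / wavelength_m / 1e9:7.2f} GHz")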
1. Across-track scanning
2. Along-track scanning system
3. Side looking or oblique scanning system (Radar)
7.4.1 Across-Track Scanners
This scanning system makes use of a faceted mirror that is rotated by an
electric motor with the horizontal axis of rotation parallel to the direction
of flight. The mirror scans the landscape in a pattern of parallel scan lines
at right angles to the path of the airborne platform. The mirrors transmit
energy that are reflected or radiated from the ground onto the detector. This
type of scanner is also known as a whisk broom scanner system. The energy
flux, sensor altitude, detector spectral bandwidth, IFOV, and dwell time are all aspects that affect the strength of the sensor signal produced by a detector. Since across-track scanners have a short dwell time, the detector receives less energy and generates a weak signal. The Multispectral Scanner (MSS) and Thematic Mapper (TM) of the Landsat series of satellites are examples of across-track scanners.
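The dwell-time limitation can be estimated with a few lines of arithmetic. The numbers below are illustrative assumptions (a typical satellite ground speed and Landsat-like pixel size), not the parameters of any specific sensor:

def whisk_dwell_time(ground_speed_mps, gsd_m, pixels_per_line):
    # Time to advance one scan line, shared by one detector across the line.
    line_time = gsd_m / ground_speed_mps
    return line_time / pixels_per_line

t = whisk_dwell_time(7_000, 30, 6_000)
print(f"dwell time per pixel: {t * 1e6:.2f} microseconds")  # ~0.71 us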
the reflections of pulses carried out by radar-equipped aircraft and satellites.
SLAR (Side Looking Airborne Radar) is a common kind of remote sensing
technique used to obtain radar images of the terrain. SLAR's primary
components include an antenna, a duplexer, a transmitter, a receiver, a
pulse-generating device, and a cathode ray tube.
Optical imaging sensor: Optical imaging sensors use the visible and
reflected IR bands. Optical imaging systems utilised on space platforms
include panchromatic, multispectral, and hyper spectral systems.
Thermal IR imaging sensor: A thermal sensor operates in the electromagnetic spectrum between about 9 and 14 μm, in the mid-to-far-infrared range. Any object with a temperature above absolute zero emits infrared radiation and can therefore be captured in a thermal image.
Radar imaging sensor: A radar (microwave) imaging sensor is typically an active sensor that operates in the electromagnetic spectrum between 1 mm and 1 m. The sensor transmits microwave energy to the ground, the target reflects the energy back to the radar antenna, and a microwave image is formed. The radar follows a flight path, and the radar's illuminated area, or footprint, travels across the surface in a swath.
7.5.2 Non-Imaging Sensors
A profile recorder is a non-imaging sensor that measures a signal
depending on the intensity of the full field of vision. This type of sensor does
not store information about how the input varies over time. Non-imaging
sensors used in remote sensing include radiometers, altimeters, spectrometers, spectroradiometers, and LIDAR.
Radiometer: A radiometer is any piece of equipment that quantitatively
measures electromagnetic radiation in a specific range of the
electromagnetic spectrum.
Spectrometer: A spectrometer is a sensor with a component, such as a
prism or diffraction grating, that may break a portion of the spectrum into
discrete wavelengths and scatter (or separate) them at different angles to
an array of detectors.
Spectroradiometer: Spectroradiometers are sensors that gather diffused radiation in bands rather than at specific wavelengths. The most popular air/space sensors are spectroradiometers.
Let us Sum Up
Platform is the vehicle or carrier for remote sensors, from which a sensor
can be operated. Weather Surveillance Radar is of the long-range type, which detects and tracks typhoons and cloud masses at distances of 400 kilometres or less. The radar is a useful tool in tracking and monitoring
tropical cyclones. Active Sensors are the sensors that detect reflected
responses from objects which are irradiated from artificially generated
energy sources. A spectrometer is a sensor with a component, such as a
prism or diffraction grating, that may break a portion of the spectrum into
discrete wavelengths and scatter (or separate) them at different angles to
an array of detectors.
Glossary
The GPS (Global Positioning System) is a satellite-based navigation system
with at least 24 satellites. Anywhere in the globe, GPS works in any weather
condition.
A rover is a planetary surface exploration device designed to move on the
solid surface of a planet or other celestial body of planetary mass.
Radiometer: any instrument that quantitatively measures electromagnetic radiation in a specific range of the electromagnetic spectrum.
Suggested Readings
1. Campbell, J., 1989. Introduction to Remote Sensing, Guilford, New York.
2. Lillesand, T.M. and Kiefer, R.W., 1994. Remote Sensing and Image Interpretation, John Wiley & Sons, New York.
Web Sources
1. http://www.drbrambedkarcollege.ac.in/sites/default/files/Remote%20
sensing%20platforms.pdf
2. https://eos.com/blog/types-of-remote-sensing/
Unit 8
Active and Passive, Optical-Mechanical
Scanners and Push-Broom Scanners
Structure
Overview
Learning Objectives
Overview
This scanning system makes use of a faceted mirror that is rotated by an
electric motor with the horizontal axis of rotation parallel to the direction of
flight. The mirror scans the landscape in a pattern of parallel scan lines at
right angles to the path of the airborne platform. The mirrors transmit energy
that are reflected or radiated from the ground onto the detector. This type of
scanner is also known as a whisk broom scanner system. The energy flux,
sensor altitude, detector spectral bandwidth, IFOV, and dwell time are all aspects that affect the strength of the sensor signal produced by a detector. Since across-track scanners have a short dwell time, the detector receives less energy and generates a weak signal.
Learning Objectives
After Learning this lesson, you will be able to:
• Understand the Active and Passive Scanner
• Acquire the Knowledge of Optical-Mechanical Scanners and Push-
Broom Scanners
The function of the elements of an optical mechanical scanner are as
follows.
a. Optical system: A reflective telescope system such as Newtonian, Cassegrain or Ritchey-Chretien is used to avoid chromatic aberration.
b. Spectrographic system: Dichroic mirror, grating, prism or filter are
utilized.
Compared to the pushbroom scanner, the optical mechanical scanner has certain advantages. For example, the view angle of the optical system can be very narrow, band-to-band registration error is small, and the resolution is higher. Its disadvantage is that the signal-to-noise ratio (S/N) is rather low, because the integration time at the optical detector cannot be very long owing to the scanner motion.
One embodiment of the scanning device includes, in order along the path of a mean incident beam from the field of vision: an objective; a raster scanning mirror for scanning in the y direction; a field mirror which delimits the field of the objective in the x direction; a rotating drum and an image transport system for line scanning in the x direction, with the field mirror deflecting the beams towards the drum; and a detector sensitive to the radiation contained in the beams. The scanning device ensures convergence of the beams at the detector. A push broom scanner, also
known as an along-track scanner, is a device for obtaining images
with spectroscopic sensors. The scanners are regularly used for
passive remote sensing from space, and in spectral analysis on production
lines, for example with near-infrared spectroscopy used to identify
contaminated food and feed. The moving scanner line in a traditional
photocopier is also a familiar, everyday example of a push broom scanner.
Push broom scanners and the whisk broom scanners variant are often
contrasted with staring arrays (such as in a digital camera), which image
objects without scanning, and are more familiar to most people.
Glossary
Optical system: A reflective telescope system such as Newtonian, Cassegrain or Ritchey-Chretien is used to avoid chromatic aberration.
Spectrographic system: Dichroic mirror, grating, prism or filter are utilized.
Scanning system: A rotating or oscillating mirror is used for scanning perpendicular to the flight direction.
Detector system: Electromagnetic energy is converted to an electric signal by optical electronic detectors. Photomultiplier detectors are utilized in the near-ultraviolet and visible region, silicon diodes in the visible and near infrared, cooled indium antimonide (InSb) in the short-wave infrared, and thermal bolometers or cooled HgCdTe in the thermal infrared.
Reference system: The converted electric signal is influenced by a change
of sensitivity of the detector. Therefore, light sources or thermal sources with
constant intensity or temperature should be installed as a reference for
calibration of the electric signal.
perpendicular to the flight direction.
2. What are Push-Broom Scanners?
The push broom scanner has a linear array of detectors, in which each
detector measures the radiation reflected from a small area on the
ground. In this type of scanning system, linear array of detectors scan
in the direction parallel to the flight line.
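A back-of-the-envelope comparison shows why the along-track arrangement improves the signal: each detector in the linear array stares at its ground cell for the full line-advance time, while a whisk broom detector must share that time across every pixel in the line. The figures below are illustrative assumptions only:

ground_speed, gsd, pixels_per_line = 7_000.0, 30.0, 6_000
line_time = gsd / ground_speed                 # time to advance one line
push_dwell = line_time                         # each detector stares the whole line time
whisk_dwell = line_time / pixels_per_line      # one detector shared across the line
print(f"push broom:  {push_dwell * 1e3:.2f} ms per pixel")
print(f"whisk broom: {whisk_dwell * 1e6:.2f} microseconds per pixel")

The dwell time thousands of times longer is the reason push broom systems can achieve a better signal-to-noise ratio.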
Suggested Readings
1. Campbell, J., 1989. Introduction to Remote Sensing, Guilford, New York.
2. Lillesand, T.M. and Kiefer, R.W., 1994. Remote Sensing and Image Interpretation, John Wiley & Sons, New York.
3. Sarkar, A., 2015. Practical Geography: A Systematic Approach. Orient Black Swan Private Ltd., New Delhi.
4. Anji Reddy, M., 2008. Textbook of Remote Sensing and Geographic Information System, B.S. Publication, Hyderabad.
Web Sources
1. http://www.drbrambedkarcollege.ac.in/sites/default/files/Remote%20
sensing%20platforms.pdf
2. https://eos.com/blog/types-of-remote-sensing/
Unit 9
Thermal Remote Sensing and Ideal Remote
Sensing Systems
Structure
Overview
Learning Objectives
Overview
The earth-atmosphere system derives its energy from the sun which, being at a very high temperature, radiates maximum energy in the shorter wavelengths (visible, 0.20 to 0.80 µm). The earth-atmosphere system absorbs part of this energy (the rest is returned to space by reflection from the surface, clouds and other reflectors/scatterers in the atmosphere), which in turn heats it up and raises its temperature. At this temperature, in the range of 300 kelvin, the system emits its own radiation in the longer wavelengths called 'thermal infrared'. Observations in the thermal wavelengths of the electromagnetic spectrum (3-35 µm) are generally referred to as thermal remote sensing. In this region the radiation emitted by the earth due to its thermal state is far more intense than the solar
reflected radiation, therefore any sensor operating in this wavelength region
would primarily detect the thermal radiative properties of ground material.
All materials having a temperature above absolute zero (0 K or −273 °C) emit infrared energy both day and night. Infrared sensing refers to the detection of remote objects by recording the amount of infrared energy emitted from various surfaces as a continuous-tone image on photographic film. Thermal IR imagery is usually obtained in the wavelength regions 3 to 5.5 µm and 8 to 14 µm because of atmospheric absorption at other wavelengths.
Learning Objectives
After Learning this lesson, you will be able to:
• Understand the Thermal Remote Sensing
• Acquire the Knowledge of Ideal Remote Sensing
near infrared (NIR) and in extreme cases even the visible region of the EM
spectrum.
Thermal remote sensing, in principle, is different from remote sensing in the
optical and microwave region. In practice, thermal data proves to be
complementary to other remote sensing data. Thus, though still not fully explored, thermal remote sensing holds potential for a variety of applications.
9.1.1 Thermal Atmospheric Windows
While Thermal IR region extends from 3-14 μm, only portions of the
spectrum are suitable for remote sensing applications. There are several
atmospheric windows in the thermal portion of the spectrum, but none of the
windows transmits 100% of the emitted radiation. Water vapor and carbon dioxide absorb some of the energy across the spectrum, and ozone absorbs strongly near 9.6 μm, which is why thermal sensing from satellites often uses the 10.5-12.5 μm range. The gases and particles in
the atmosphere also absorb incoming radiation and emit their own thermal
energy. Most thermal sensing is performed in the 8-14 μm region of the
spectrum not only because it includes an atmospheric window, but because
it contains the peak energy emissions for most of Earth’s surface features.
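That claim follows from Wien's displacement law, λmax = b / T with b ≈ 2898 μm·K. A small worked sketch (the temperatures are round, illustrative values):

WIEN_B = 2898.0  # Wien displacement constant, micrometre-kelvin (approximate)

for name, t_kelvin in [("Earth's surface", 300.0),
                       ("forest fire", 1000.0),
                       ("the Sun", 6000.0)]:
    print(f"{name} (~{t_kelvin:.0f} K): peak emission near {WIEN_B / t_kelvin:.1f} um")

A 300 K surface peaks near 9.7 μm, inside the 8-14 μm window, while hot targets such as fires peak near the 3-5 μm window.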
9.1.2 Wavelength / Spectral Range
The infrared portion of the electromagnetic spectrum is usually considered
to be from 0.7 to 1,000 µm. Within this infrared portion, there are various
nomenclatures and little consensus among various groups to define the sub
boundaries. In terrestrial remote sensing the region of 3 to 35 µm is
popularly called thermal infrared. As in all other remote sensing missions,
data acquisitions are made only in regions of least spectral absorption
known as the atmospheric windows. Within the thermal infrared an excellent
atmospheric window lies between 8-14 µm wavelength. Poorer windows lie
in 3-5 µm and 17-25 µm. Interpretation of the data in 3-5 µm is complicated
due to overlap with solar reflection in day imagery and 17-25 µm region is
still not well investigated. Thus 8-14 µm region has been of greatest interest
for thermal remote sensing.
9.1.3 Spectral Emissivity and Kinetic Temperature
Thermal remote sensing exploits the fact that everything above absolute
zero (0 K or -273.15 °C or –459 °F) emits radiation in the infrared range of
the electromagnetic spectrum. How much energy is radiated, and at which
wavelengths, depends on the emissivity of the surface and on its kinetic
temperature. Emissivity is the emitting ability of a real material compared to
that of a black body and is a spectral property that varies with composition
of material and geometric configuration of the surface. Emissivity denoted
by epsilon (ε) is a ratio and varies between 0 and 1. For most natural
materials, it ranges between 0.7 and 0.95. Kinetic temperature is the surface
temperature of a body/ground and is a measure of the amount of heat
energy contained in it. It is measured in units such as kelvin (K), degrees Celsius (°C), or degrees Fahrenheit (°F).
9.2 Emissivity
Objects in the real world are not perfect blackbodies. Not all of the incident
energy upon them is absorbed, therefore they are not perfect emitters of
radiation. The emissivity (ε) of a material is the relative ability of its surface
to emit heat by radiation. Emissivity is defined as the ratio of the energy
radiated from an object's surface to the energy radiated from a blackbody
at the same temperature.
Emissivity values can range from 0 to 1. A blackbody has an emissivity of
1, while a perfect reflector or whitebody has an emissivity of 0. Most natural
objects are considered "graybodies" as they emit a fraction of their
maximum possible blackbody radiation at a given temperature. Water has
an emissivity close to 1 and most vegetation also has an emissivity close to
1. Many minerals and metals have emissivities significantly less than 1.
Emissivity can also vary with temperature, depending on the material. Emissivities for some common materials are given below.
Material     Emissivity
Concrete     0.92
Glass        0.92
Gypsum       0.85
Ice          0.97
Sand         0.90
Snow         0.80
Water        0.95
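The practical consequence of emissivity can be quantified with the Stefan-Boltzmann law: a graybody emits M = εσT⁴, so a radiometer that assumes blackbody behaviour reports a radiant temperature Trad = ε^(1/4) × Tkin, lower than the true kinetic temperature. A minimal sketch using values from the table above (the 280 K kinetic temperature is an arbitrary example):

def radiant_temperature(kinetic_k, emissivity):
    # Blackbody-equivalent temperature a radiometer would report.
    return emissivity ** 0.25 * kinetic_k

for material, eps in [("water", 0.95), ("snow", 0.80)]:
    print(f"{material}: kinetic 280.0 K -> radiant {radiant_temperature(280.0, eps):.1f} K")

Two surfaces at the same kinetic temperature thus appear several kelvin apart, which is exactly the effect described in the next paragraph.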
The emissivity of a surface depends not only on the material but also on the
nature of the surface. For example, a clean and polished metal surface will
have a low emissivity, whereas a roughened and oxidized metal surface will
have a high emissivity. Two materials lying next to one another on the
ground could have the same true kinetic temperature but have different
apparent radiant temperatures when sensed by a thermal radiometer simply
because their emissivities are different. Emissivity can be used to identify mineral composition. Knowledge of surface emissivity is also important for obtaining accurate true kinetic temperature measurements from radiometers.
Many materials (graybodies) have an emissivity less than 1, and this emissivity is constant across all wavelengths. For any given wavelength the emitted energy of a graybody is a fraction of that of a
blackbody. The emissivity of some objects varies depending on the
wavelength. These objects are referred to as selective radiators or as being
selectively radiant. The emissivity of such materials can vary greatly
depending on the wavelength. Some materials may behave like blackbodies
at certain wavelengths (ε close to 1) but may have reduced emissivity at
other wavelengths. Emissivity varies across the wavelengths for materials such as quartz and feldspar: both are selective radiators, but quartz has considerably more variation in emissivity at different wavelengths.
systems are multispectral, meaning they collect data on emitted radiation
across a variety of wavelengths.
9.3.1 Thermal Sensors
a. Thermal Infrared Multispectral Scanner (TIMS)
NASA and the Jet Propulsion Laboratory developed the Thermal Infrared
Multispectral Scanner (TIMS) for exploiting mineral signature information.
TIMS is a multispectral scanning system with six different bands ranging
from 8.2 to 12.2 μm and a spatial resolution of 18m. TIMS is mounted on an
aircraft and was primarily designed as an airborne geologic remote sensing
tool. TIMS acquires mineral signature data that permits the discrimination of
silicate, carbonate and hydrothermally altered rocks. TIMS data have been
used extensively in volcanology research in the western United States,
Hawaiian Islands and Europe. Multispectral data allow the generation of three-band colour composites like other multispectral data. Many materials have varying emissivities and can be identified by the variation in emitted energy.
A thermal image of Death Valley, California, captured by the Thermal Infrared Multispectral Scanner (TIMS), illustrates this. A colour composite has been produced using three thermal bands
collected by TIMS. There are a variety of different materials and minerals in
Death Valley with varying emissivities. In this image Thermal Band 1 (8.2 -
8.6μm) is displayed in blue, Thermal Band 3 (9.0 - 9.4μm) is displayed in
green and Thermal Band 5 (10.2 - 11.2 μm) is displayed in red. Alluvial fans
appear in shades of reds, lavender, and blue greens; saline soils in yellow;
and different saline deposits in blues and greens.
b. Advanced Spaceborne Thermal Emission and Reflection
Radiometer (ASTER)
data with 1000m spatial resolution. MODIS has high temporal resolution
with a one-to-two-day return time. This makes it an excellent resource for
detecting and monitoring wildfires. One of the products generated from
MODIS data is the Thermal Anomalies/Fire product which detects hotspots
and fires.
d. Landsat
A variety of the Landsat satellites have carried thermal sensors. The first Landsat satellite to collect thermal data was Landsat 3; however, this part of the sensor failed shortly after the satellite was launched. Landsat 4 and 5
included a single thermal band (band 6) on the Thematic Mapper (TM)
sensor with 120m spatial resolution that has been resampled to 30m. A
similar band was included on the Enhanced Thematic Mapper Plus (ETM+)
on Landsat 7. Landsat 8 includes a separate thermal sensor known as the
Thermal Infrared Sensor (TIRS). TIRS has two thermal bands, Band 10
(10.60 - 11.19μm) and Band 11 (11.50 - 12.51μm). The TIRS bands are
acquired at 100 m spatial resolution but are resampled to 30m in the
delivered data products.
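As an illustration of how such thermal bands are turned into temperatures, Landsat 8 digital numbers are first rescaled to at-sensor radiance and then inverted with the sensor's thermal constants. The constants below are the Band 10 values as published in typical Landsat 8 MTL metadata files; treat this as a sketch and read the actual constants from the MTL file of the scene being processed.

import math

# Band 10 rescaling and thermal constants from a typical Landsat 8 MTL file.
ML, AL = 3.342e-4, 0.1          # radiance gain and offset
K1, K2 = 774.8853, 1321.0789    # thermal conversion constants

def brightness_temperature(dn):
    # At-sensor brightness temperature (kelvin) from a Band 10 digital number.
    radiance = ML * dn + AL
    return K2 / math.log(K1 / radiance + 1.0)

print(f"{brightness_temperature(25_000):.1f} K")  # ~292 K for this sample DN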
e. Landsat TIRS and Applications
Irrigation accounts for 80% of freshwater use in the U.S., and water usage
has become an increasingly important issue, particularly in the West.
Thermal infrared data from Landsat 8 is being used to estimate water use.
Landsat 8 data, including visible, near infrared, mid-infrared, and thermal
data are fed into a relatively sophisticated energy balance model that
produces evapotranspiration maps. Evapotranspiration (ET) refers to the
conversion of water into water vapor by the dual process of evaporation
from the soil and transpiration (the escape of water through a plant's stomata).
For vegetated land, ET is synonymous with water consumption. Landsat
data enable water resources managers and administrators to determine
how much water was consumed from individual fields.
A Series of Unique Energy/Matter Interactions at the Earth's Surface which
generate reflected and/or emitted signals that are selective with respect to
wavelength and unique to each object or earth surface feature type.
A Super Sensor which is highly sensitive to all wavelengths. A super sensor
would be simple, reliable, accurate, economical, and requires no power or
space. This sensor yields data on the absolute brightness from a scene as
a function of wavelength.
A Real-Time Data Handling System which generates the instantaneous radiance-versus-wavelength response and processes it into an interpretable format in real time. The data derived are unique to a particular terrain and hence provide insight into its physical-chemical-biological state.
Multiple Data Users having knowledge in their respective disciplines and in
remote sensing data acquisition and analysis techniques. The information
collected will be available to them faster and at less expense. This
information will aid the users in various decision-making processes and
further in implementing these decisions.
Let us Sum Up
Thermal remote sensing, in principle, is different from remote sensing in the
optical and microwave region. In practice, thermal data proves to be
complementary to other remote sensing data. Thus, though still not fully
explored, thermal remote sensing holds potential for a variety of
applications. In terrestrial remote sensing the region of 3 to 35 µm is
popularly called thermal infrared. As in all other remote sensing missions,
data acquisitions are made only in regions of least spectral absorption
known as the atmospheric windows. Emissivity is the emitting ability of a
real material compared to that of a black body and is a spectral property that
varies with composition of material and geometric configuration of the
surface.
Glossary
A Uniform Energy Source which provides energy over all wavelengths, at a
constant, known, high level of output.
A Non-interfering Atmosphere which will not modify either the energy
transmitted from the source or emitted (or reflected) from the object in any
manner.
A Series of Unique Energy/Matter Interactions at the Earth's Surface which
generate reflected and/or emitted signals that are selective with respect to
wavelength and unique to each object or earth surface feature type.
Suggested Readings
1. Campbell, J., 1989. Introduction to Remote Sensing, Guilford, New York.
2. Lillesand, T.M. and Kiefer, R.W., 1994. Remote Sensing and Image Interpretation, John Wiley & Sons, New York.
3. Sarkar, A., 2015. Practical Geography: A Systematic Approach. Orient Black Swan Private Ltd., New Delhi.
4. Anji Reddy, M., 2008. Textbook of Remote Sensing and Geographic Information System, B.S. Publication, Hyderabad.
5. Chauniyal, D.D., 2010. Sudur Samvedan evam Bhogolik Suchana Pranali (Hindi), Sharda Pustak Bhawan, Allahabad.
Web Sources
http://gsp.humboldt.edu/olm/Courses/GSP_216/lessons/thermal/sensors.html
BLOCK 4
Unit 10
Aerial Photo Imaging System and Types of
Aerial Photographs
Structure
Overview
Learning Objectives
10.1 Aerial Photo Imaging System
10.2 Geometry of an Aerial Photograph
10.3 Scales of Aerial Photograph
Overview
The geometry of an aerial photograph is based on the simple, fundamental
condition of collinearity. Three or more points that lie on the same line are
said to be collinear. In photogrammetry, a single ray of light is the straight
line; three fundamental points must always fall on this straight line: the
imaged point on the ground, the focal point of the camera lens, and the
image of the point on the film or imaging array of a digital camera. The length
of each ray, from the focal point of the camera to the imaged point on the
ground, is determined by the height of the camera lens above the ground
and the elevation of that point on the ground.
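For the simplest case of a truly vertical photograph taken directly above the origin, the collinearity condition reduces to scaling a ground point's horizontal offsets by f / (H − h). A small sketch with hypothetical values:

def image_coords(gx, gy, gz, flying_height, focal_length):
    # Project a ground point through the lens onto a vertical photo plane:
    # the ground point, the lens centre and the image point lie on one ray.
    scale = focal_length / (flying_height - gz)
    return scale * gx, scale * gy

# f = 152 mm, H = 1,500 m; a point 400 m east of nadir at 100 m elevation.
x, y = image_coords(400.0, 0.0, 100.0, 1500.0, 0.152)
print(f"image position: {x * 1000:.1f} mm, {y * 1000:.1f} mm from the principal point")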
Learning Objectives
After Learning this lesson, you will be able to:
• Measure Photo Coordinates and Relate them to Ground
Coordinates
• Know about Scales of Aerial Photograph
• Understand Aerial Photographs and their Types.
• Acquire the Knowledge and Compare Between the Types of Aerial
Photograph
Orthogonal Projection: This is a special case of parallel projections. Maps
are orthogonal projections of the ground. The advantage of this projection
is that the distances, angles or areas on the plane are independent of the
elevation differences of the objects. This is an example of orthogonal
projection where the projecting rays are perpendicular to the line LL1 (Fig.
10.2)
The perpendicular distance between the camera lens and the ground photographed is known as the flying height.
Medium Scale Photographs: Aerial photographs with a scale ranging
between 1:15,000 and 1:30,000 are usually treated as medium scale
photographs.
Small Scale Photographs: Photographs with a scale smaller than 1:30,000 are referred to as small scale photographs. They cover large areas in less detail: a small scale simply means that ground features appear at a smaller, less detailed size, while the area of ground coverage seen in the photo is greater than at larger scales.
Scale: the ratio of the distance between two points on a photo to the actual
distance between the same two points on the ground (i.e. 1 unit on the photo
equals "x" units on the ground). If a 1 km stretch of highway covers 4 cm on
an air photo, the scale is calculated as follows:
Photo distance / Ground distance = 4 cm / 1 km = 4 cm / 100,000 cm = 1/25,000
So the scale is 1:25,000.
The second method used to determine the scale of a photo is to find the
ratio between the camera's focal length and the plane's altitude above the
ground being photographed.
If a camera's focal length is 152 mm and the plane's altitude Above Ground Level (AGL) is 7,600 m, using the same equation as above, the scale would be:
Focal length / Altitude (AGL) = 152 mm / 7,600 m = 152 mm / 7,600,000 mm = 1/50,000
So the scale is 1:50,000.
A third method: the scale of the photograph can also be calculated if we know the focal length of the camera and the flying height of the aircraft above the ground:
Scale = f / (H − h)
Where, H = flying height of aircraft above sea level, h = height of ground
above sea level and f is focal length.
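The three methods can be collected into a few helper functions; the sample calls reproduce the worked figures above (the terrain example assumes a hypothetical 7,700 m flying height over 100 m terrain):

from fractions import Fraction

def scale_from_distances(photo_cm, ground_cm):
    # Method 1: photo distance over ground distance, in the same units.
    return Fraction(photo_cm, ground_cm)

def scale_from_focal_and_altitude(focal_mm, altitude_agl_mm):
    # Method 2: focal length over altitude above ground level, same units.
    return Fraction(focal_mm, altitude_agl_mm)

def scale_with_terrain(focal_mm, flying_height_m, terrain_height_m):
    # Method 3: Scale = f / (H - h), heights above the same datum.
    return Fraction(focal_mm, (flying_height_m - terrain_height_m) * 1000)

print(scale_from_distances(4, 100_000))               # 1/25000
print(scale_from_focal_and_altitude(152, 7_600_000))  # 1/50000
print(scale_with_terrain(152, 7_700, 100))            # 1/50000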
Vertical Photographs:
While taking aerial photographs, two distinct axes are formed from the
camera lens centre, one towards the ground plane and the other towards
the photo plane. The perpendicular dropped from the camera lens centre to
the ground plane is termed as the vertical axis, whereas the plumb line
drawn from the lens centre to the photo plane is known as the
photographic/optical axis. When the photo plane is kept parallel to the
ground plane, the two axes also coincide with each other. The photograph
so obtained is known as vertical aerial photograph. However, it is normally
very difficult to achieve perfect parallelism between the two planes because
the aircraft flies over the curved surface of the earth. The photographic axis,
therefore, deviates from the vertical axis. If such a deviation is within the range of plus or minus 3°, near-vertical aerial photographs are obtained. Any photograph with an unintentional deviation of more than 3° of the optical axis from the vertical axis is known as a tilted photograph (Fig. 10.5).
Fig. 10.5 Vertical Aerial photograph
Low Oblique: An aerial photograph taken with an intentional deviation of 15°
to 30° in the camera axis from the vertical axis is referred to as the low
oblique photograph. This kind of photograph is often used in
reconnaissance surveys (Fig. 10.6).
High Oblique: High oblique photographs are obtained when the camera axis is intentionally inclined about 60° from the vertical axis. Such photography is useful in reconnaissance surveys (Fig. 10.7).
Comparison of vertical, low oblique and high oblique photographs:

Characteristic                         Vertical                              Low Oblique                        High Oblique
Difference in comparison to the map    Least                                 Relatively greater                 Greatest
Advantages                             Topographical and thematic mapping    Useful in reconnaissance survey    Illustrative
Let us Sum Up
A photographic image is a central perspective. This implies that every light ray which reaches the film surface during exposure passed through the camera lens. The position of all points in the image is therefore controlled by one single point, which governs the geometry of the entire photograph. The principal point (PP) is the point on the image where the optical axis intersects the image plane. The optical axis is an imaginary line that passes through the optical centre of the lens and is perpendicular to the film or image plane. The distance between the perspective centre and the principal point is the focal length.
Glossary
Vertical Photographs: While taking aerial photographs, two distinct axes are
formed from the camera lens centre, one towards the ground plane and the
other towards the photo plane.
Low Oblique: An aerial photograph taken with an intentional deviation of 15°
to 30° in the camera axis from the vertical axis is referred to as the low
oblique photograph. This kind of photograph is often used in
reconnaissance surveys.
High Oblique: The high oblique are photographs obtained when the camera
axis intentionally inclined about 60° from the vertical axis. Such photography
is useful in reconnaissance surveys.
• Vertical photographs
• Low oblique photographs
• High oblique photographs
Suggested Readings
1. Campbell, J., 1989. Introduction to Remote Sensing, Guilford, New York.
2. Lillesand, T.M. and Kiefer, R.W., 1994. Remote Sensing and Image Interpretation, John Wiley & Sons, New York.
3. Sarkar, A., 2015. Practical Geography: A Systematic Approach. Orient Black Swan Private Ltd., New Delhi.
4. Anji Reddy, M., 2008. Textbook of Remote Sensing and Geographic Information System, B.S. Publication, Hyderabad.
Web Sources
1. http://gsp.humboldt.edu/olm/Courses/GSP_216/lessons/thermal/sensors.html
Unit 11
Marginal Information of Aerial Photographs
Structure
Overview
Learning Objectives
11.1 Marginal Information of Aerial Photographs
Overview
The word 'photography' means 'writing with light'. Aerial photography means taking pictures of the earth from the air. A photograph is the signature of energy emitted or reflected by an object on photographic film. Photographic interpretation is "the act of examining photographic images for the purpose of identifying objects and judging their significance." The most important principle of photo interpretation is observation; second is the capacity to use logical modes of thought to draw correct conclusions from the things observed. Aerial photographs provide a unique tool. They cover a large area of the earth's surface. Overlapping pairs of photographs provide a three-dimensional view of the objects photographed. Images on aerial photographs are permanent and unbiased representations of objects occurring on the earth's surface. The large area photographed enables a photo interpreter to perceive relations between objects and their background.
Learning Objectives
After Learning this lesson, you will be able to:
• Know about Marginal Information of Aerial Photograph
• Understand Aerial Photographs
Fig. 11.1 Marginal Information given on Vertical Aerial Photographs
(Credit: NCERT)
A: Fiducial Marks B: Photo Specifications
C: Tilt Indicator D: Flying Height Indicator
793 is a Photo Specification number maintained by the 73 APFPS Party of
the Survey of India. B is the Flying Agency that carried out the present
photography (In India three flying agencies are officially permitted to carry
out aerial photography. They are the Indian Air Force, the Air Survey
Company, Kolkata, and the National Remote Sensing Agency, Hyderabad,
identified on the aerial photographs as A, B and C respectively), 5 is the
strip number and 23 is the photo number in strip 5.
11.1.1 Fiducial Marks:
A fiducial marker or fiducial is an object placed in the field of view of
an imaging system that appears in the image produced, for use as a point
of reference or a measure. It may be either something placed into or on the
imaging subject, or a mark or set of marks in the reticle of an optical
instrument.
Fiducial marks are small registration marks exposed on the edges of a photograph. The distances between fiducial marks are precisely measured when a camera is calibrated. They are helpful in locating the principal point. These marks are also called collimating marks.
towards East. When strips are North-South, they are numbered from South
towards North.
11.1.8 Transfer Point:
When the principal point of one photograph is transferred onto the adjacent overlapping photograph, the transferred position is called the transfer point. It usually falls on some identifiable object, such as a building. The point is located with the help of the fiducial marks by viewing the overlapping pair stereoscopically and then marking it. The transfer point does not fall at the centre of the adjacent photograph but towards its side.
11.1.9 Focal Length:
Focal length is the distance between the negative plane and the optical centre of the lens. It is shown on the aerial photograph to the right of the clock, generally in millimetres. As focal length increases, image distortion decreases. The focal length is precisely measured when the camera is calibrated, and it helps in calculating the scale.
11.1.10 Altimeter:
The altimeter reading gives the flying height above mean sea level; it may also represent the average height of an area. It is shown on the aerial photograph to the left of the clock.
Let us Sum Up
Aerial photography or airborne imagery is the taking of photographs from
an aircraft or other flying object. Aerial photography is used in cartography
(particularly in photogrammetric surveys, which are often the basis for
topographic maps), land-use planning, archaeology, movie production,
environmental studies, power line inspection, surveillance, commercial
advertising, conveyancing, and artistic projects.
Glossary
Fiducial Marks: Fiducial marks are small registration marks exposed on the
edges of a photograph.
Photographic Number: Aerial Photographs are numbered serially along the
strip or flight direction.
Altimeter: The altimeter reading gives the flying height above mean sea level. It is shown on the aerial photograph to the left of the clock.
Suggested Readings
1. Campbell, J., 1989. Introduction to Remote Sensing, Guilford, New York.
2. Lillesand, T.M. and Kiefer, R.W., 1994. Remote Sensing and Image Interpretation, John Wiley & Sons, New York.
3. Sarkar, A., 2015. Practical Geography: A Systematic Approach. Orient Black Swan Private Ltd., New Delhi.
4. Anji Reddy, M., 2008. Textbook of Remote Sensing and Geographic Information System, B.S. Publication, Hyderabad.
Web Sources
1. https://www.ijsr.net/archive/v3i9/U0VQMTQ1MDM=.pdf
Unit 12
Elements of Photo Interpretation
Structure
Overview
Learning Objectives
12.1 Photo Interpretation
Overview
The analysis of remote sensing imagery involves the identification of various targets in an image; those targets may be natural or man-made features consisting of points, lines, or areas. Targets reflect or emit radiation, which is measured and recorded by a sensor and finally presented as an image such as an air photo or a satellite image. An image interpreter examines aerial or satellite imagery for the purpose of making use of it, identifying the features, and evaluating their significance. Image interpretation techniques have developed over more than 100 years, initially focusing on military applications and later extending to a wide range of
scientific and commercial uses. The process of extracting qualitative and quantitative information about objects from aerial photographs or satellite images is known as interpretation. Aerial photo interpretation is the process of interpreting aerial photographs; it relies on the abilities of a human analyst, known as a photo interpreter.
Learning Objectives
After Learning this lesson, you will be able to:
• Know about Elements of Photo Interpretation
• Aerial Photographic Interpretation
• Basic Characteristics of Interpretation
• Analysis and deduction: Detect the spatial order of the objects and
predict the occurrence of certain relationships.
• Classification: To arrange the objects and elements identified into
an orderly system
• Accuracy determination: Field verification to confirm the interpretation.
Kinds and amounts of information that could be obtained from aerial
photographs depend primarily on
• Type of terrain
• Climatic environment
• Stage of the geomorphic cycle.
• Vertical Exaggeration
12.2.1 Photographic Tone
Tone is a measure of the relative amount of light reflected by an object and recorded on the photograph. It refers to the relative brightness or colour of objects in an image. Photographic tone is influenced by the reflectivity of the object, the angle of the reflected light, geographic latitude, the type of photography and film sensitivity, the light transmission of filters, and photographic processing.
12.2.2 Photographic Texture
Texture signifies the frequency of change and arrangement of tones in a photographic image. Texture is produced by an aggregation of unit features and determines the overall visual "smoothness" or "coarseness" of the image; it can distinguish two objects with the same tone. Texture is dependent on the scale of the aerial photograph: as the scale is reduced, the texture progressively becomes finer and ultimately disappears.
Rough texture: irregularly distributed objects.
Pattern is the spatial arrangement of objects and can reveal genetic relations: the orderly repetition of aggregate features in certain geometric or planimetric arrangements, e.g. fold patterns, drainage patterns, outcrop and lithological patterns.
Association
The occurrence of certain features in relation to others. Eg. River and
drainage, buildings and roads, open cast mine and trenches.
12.2.6 Site
Statement of an object’s position in relation to others in its vicinity and
usually aids in its identification, e.g. certain vegetation types or tree species are expected to occur on well-drained uplands or in certain regions.
12.2.7 Shadow
Let us Sum Up
The process of analysing and extracting valuable information from aerial photographs is known as aerial photo interpretation.
The assignment of objects, characteristics, or locations to classes based on
their appearance on imagery is known as classification.
Enumeration is the process of listing or counting discrete objects seen on
an image.
The features of image aspects of aerial photography, such as tone, size,
shape, texture, pattern, shadow, site, and association are used to extract
information from aerial photography.
Association is the relationship between the target of interest and other recognised objects or features near it.
Glossary
The features of image aspects of aerial photography, such as tone, size,
shape, texture, pattern, shadow, site, and association, are used to extract
information from aerial photography.
The detection and measurement of light waves in the optical region of the
electromagnetic spectrum is known as radiometry.
Photometry is the study of measuring light, which is perceived by the human
eye in terms of brightness.
Suggested Readings
1. Campbell, J., 1989. Introduction to Remote Sensing, Guilford, New York.
2. Lillesand, T.M. and Kiefer, R.W., 1994. Remote Sensing and Image Interpretation, John Wiley & Sons, New York.
3. Lueder, D.R., 1959. Aerial Photographic Interpretation: Principles and Applications, McGraw-Hill, New York.
Web Sources
1. https://natural-resources.canada.ca/
2. https://en.wikipedia.org/
3. https://ibis.geog.ubc.ca/courses/geob373/lectures/Handouts/lecture05.pdf
4. https://www.nrcan.gc.ca/maps-tools-publications/satellite-imagery-air-photos/air-photos/national-air-photo-library/about-aerial-photography/concepts-aerial-photography/9687
BLOCK 5
Unit 13
Types of Satellites: Geostationary and Sun-
synchronous Satellites
Structure
Overview
Learning Objectives
Overview
A satellite or artificial satellite is an object intentionally placed into orbit in
outer space. Except for passive satellites, most satellites have an electricity
generation system for equipment on board, such as solar panels or
radioisotope thermoelectric generators (RTGs). Most satellites also have a
method of communication to ground stations, called transponders. Many
satellites use a standardized bus to save cost and work, the most popular
of which are small CubeSats. Similar satellites can work together as a group, forming constellations. Because of the high launch cost to space, satellites
are designed to be as lightweight and robust as possible. Satellites are
placed from the surface to orbit by launch vehicles, high enough to avoid
orbital decay by the atmosphere. Satellites can then change or maintain the
orbit by propulsion, usually by chemical or ion thrusters. In 2018, about 90%
of satellites orbiting Earth were in low Earth orbit or geostationary orbit; geostationary means the satellites appear stationary in the sky. Some imaging satellites choose a Sun-synchronous orbit because they can scan the entire globe under similar lighting. As the number of satellites and space debris around Earth increases, the collision threat is becoming more severe. A
small number of satellites orbit other bodies (such as the Moon, Mars, and
the Sun) or many bodies at once (two for a halo orbit, three for a Lissajous
orbit).
Learning Objectives
After Learning this lesson, you will be able to:
• Know about Types of Satellites
• Geostationary and Sun-synchronous Satellites
• GEO (Geostationary Earth Orbit) at about 36,000 km above the earth's surface.
• LEO (Low Earth Orbit) at about 500-1,500 km above the earth's surface.
• MEO (Medium Earth Orbit) or ICO (Intermediate Circular Orbit) at about 6,000-20,000 km above the earth's surface.
• HEO (Highly Elliptical Orbit)
• Sun-synchronous orbit (SSO) satellites.
13.1.1 Geostationary Satellites
A geostationary orbit, also referred to as a geosynchronous equatorial orbit (GEO), is a circular geosynchronous orbit 35,786 km (22,236 mi) in altitude above Earth's Equator (42,164 km (26,199 mi) in radius from Earth's centre) and following the direction of Earth's rotation. An object in such an orbit has an orbital period equal to Earth's rotational period, one sidereal day, and so to ground observers it appears motionless, in a fixed position in the sky.
The concept of a geostationary orbit was popularised by the science fiction
writer Arthur C. Clarke in the 1940s to revolutionise telecommunications,
and the first satellite to be placed in this kind of orbit was launched in 1963.
Communication satellites are often placed in a geostationary orbit so that Earth-based satellite antennas do not have to rotate to track them but can be pointed permanently at the position in the sky where
the satellites are located. Weather satellites are also placed in this orbit for
real-time monitoring and data collection, and navigation satellites to provide
a known calibration point and enhance GPS accuracy. Geostationary
satellites are launched via a temporary orbit and placed in a slot above a particular point on the Earth's surface. The orbit requires some station-keeping to hold its position, and retired satellites are moved to a higher graveyard orbit to avoid collisions.
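The 35,786 km altitude quoted above can be recovered from Kepler's third law for a circular orbit, T = 2π√(a³/μ), with Earth's gravitational parameter μ ≈ 398,600 km³/s². The same function also reproduces the 95-120 minute LEO periods discussed below:

import math

MU_EARTH = 398_600.4418  # Earth's gravitational parameter, km^3/s^2
R_EARTH = 6_378.137      # Earth's equatorial radius, km

def orbital_period_minutes(altitude_km):
    # Kepler's third law for a circular orbit of semi-major axis a.
    a = R_EARTH + altitude_km
    return 2 * math.pi * math.sqrt(a ** 3 / MU_EARTH) / 60

print(f"GEO (35,786 km): {orbital_period_minutes(35_786) / 60:.2f} hours")  # ~23.93 h
print(f"LEO (   700 km): {orbital_period_minutes(700):.1f} minutes")        # ~98.8 min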
Advantages of GEO satellite
• Three Geostationary satellites are enough for complete coverage of
almost any spot-on earth.
• Receivers and senders can use fixed antenna positions, no
adjusting is needed.
• GEOs are ideal for TV and radio broadcasting.
• Lifetime expectations for GEOs are rather high, at about 15 years.
• Geostationary satellites have a 24-hour view of a particular area.
• GEOs typically do not need handover due to the large footprints.
• GEOs don't exhibit any Doppler shift because the relative movement
is zero.
Disadvantages of GEO satellite
• Shading of the signals in cities due to high buildings and the low elevation further away from the equator limits transmission quality.
• The transmit power needed is relatively high (about 10 W) which
causes problems for battery powered devices.
• These satellites can't be used for small mobile phones.
• The biggest problem for voice and also data communication is the high latency of over 0.25 s one way; retransmission schemes known from fixed networks fail.
• Transferring a GEO into orbit is very expensive.
13.1.2 LEO (Low Earth Orbit)
Because LEOs circulate in a lower orbit, they exhibit a much shorter period
(typically 95 to 120 minutes). Additionally, LEO systems try to ensure a high
elevation for every spot on Earth to provide a high-quality communication link.
Each LEO satellite is visible from a given point on Earth for only about ten to
twenty minutes per pass.
A further classification of LEOs into little LEOs with low-bandwidth services
(some 100 bit/s), big LEOs (some 1,000 bit/s) and broadband LEOs with
plans reaching into the Mbit/s range can be found in Comparetto (1997).
LEO satellites are much closer to Earth than GEO satellites, ranging from
500 to 1,500 km above the surface, and they do not stay in a fixed position
relative to the surface.
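As an illustration (not from the original text; it assumes a circular two-body orbit), the 95-120 minute periods quoted above follow directly from Kepler's third law:

import math

MU = 3.986004418e14        # Earth's gravitational parameter, m^3/s^2
EARTH_RADIUS = 6378137.0   # m

def orbital_period_minutes(altitude_km):
    # T = 2 pi sqrt(a^3 / mu), with a the radius of a circular orbit
    a = EARTH_RADIUS + altitude_km * 1000.0
    return 2 * math.pi * math.sqrt(a**3 / MU) / 60.0

for h in (500, 1000, 1500):
    print(f"{h:>5} km -> {orbital_period_minutes(h):5.1f} min")
# prints ~94.6 min at 500 km, ~105.1 min at 1,000 km and ~116.0 min at 1,500 km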
Advantages of LEO satellite
• Using advanced compression schemes, transmission rates of about 2,400 bit/s can be enough for voice communication.
• LEOs provide this bandwidth even for mobile terminals with omnidirectional antennas using low transmit power in the range of 1 W.
• A LEO satellite's smaller area of coverage wastes less bandwidth.
• A LEO satellite's proximity to Earth compared with a geostationary satellite gives it better signal strength and less time delay, which makes it better for point-to-point communication.
• The smaller footprints of LEOs allow for better frequency reuse, similar to the concepts used in cellular networks.
Disadvantages of LEO satellite
• The biggest problem of the LEO concept is the need for many satellites if global coverage is to be reached.
• The high number of satellites combined with their fast movement results in a high complexity of the whole satellite system.
• The short time of visibility at high elevation requires an additional mechanism for connection handover between different satellites.
• One general problem of LEOs is the short lifetime of about five to eight years, due to atmospheric drag and radiation from the inner Van Allen belt.
• The low latency via a single LEO is only half of the story: data packets may also need to be routed from satellite to satellite (or several times between base stations and satellites) if a user wants to communicate around the world.
• A GEO typically does not need this type of routing, as senders and receivers are most likely in the same footprint.
13.1.3 MEO (Medium Earth Orbit)
• A MEO satellite is situated in orbit somewhere between 6,000 km and 20,000 km above the Earth's surface.
• MEO satellites are similar to LEO satellites in functionality.
• Medium Earth orbit satellites are visible for much longer periods than LEO satellites, usually between 2 and 8 hours.
• MEO satellites have a larger coverage area than LEO satellites.
Disadvantages of MEO
• Due to the larger distance to the Earth, the delay increases to about 70-80 ms.
• The satellites need higher transmit power and special antennas for smaller footprints.
• A MEO satellite's distance gives it a longer time delay and a weaker signal than a LEO satellite.
13.1.4 HEO (Highly Elliptical Orbit)
• The highly elliptical orbit is the only non-circular orbit of the four types.
• A HEO satellite operates in an elliptical orbit, with a maximum altitude (apogee) similar to GEO and a minimum altitude (perigee) similar to LEO.
• HEO satellites are used for special applications where coverage of high-latitude locations is required.
13.1.5 Sun-synchronous Satellites
A Sun-synchronous orbit (SSO), also called a heliosynchronous orbit, is a
nearly polar orbit around a planet, in which the satellite passes over any
given point of the planet's surface at the same local mean solar time. More
technically, it is an orbit arranged so that it precesses through one complete
revolution each year, so it always maintains the same relationship with the
Sun. A Sun-synchronous orbit is useful for imaging, reconnaissance, and
weather satellites, because every time the satellite is overhead, the
surface illumination angle on the planet underneath it is nearly the same.
This consistent lighting is a useful characteristic for satellites that image the
Earth's surface in visible or infrared wavelengths, such as weather and spy
satellites, and for other remote-sensing satellites, such as those carrying
ocean and atmospheric remote-sensing instruments that require sunlight.
For example, a satellite in Sun-synchronous orbit might ascend across the
equator twelve times a day, each time at approximately 15:00 mean local
time.
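The near-polar inclination that produces this behaviour can be estimated from the standard J2 nodal-precession model; the Python sketch below (an added illustration, not part of the original text) picks the inclination at which the orbital plane precesses through 360 degrees per year:

import math

MU = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
RE = 6378137.0        # Earth's equatorial radius, m
J2 = 1.08263e-3       # Earth's oblateness coefficient
YEAR = 365.2422 * 86400.0

def sso_inclination_deg(altitude_km):
    # Circular orbit assumed; required precession is 2*pi radians per year.
    a = RE + altitude_km * 1000.0
    n = math.sqrt(MU / a**3)              # mean motion, rad/s
    omega_dot = 2 * math.pi / YEAR        # required nodal precession, rad/s
    cos_i = -omega_dot / (1.5 * J2 * (RE / a)**2 * n)
    return math.degrees(math.acos(cos_i))

for h in (500, 800):
    print(f"{h} km -> inclination {sso_inclination_deg(h):.1f} deg")
# ~97.4 deg at 500 km and ~98.6 deg at 800 km: slightly retrograde, near-polar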
Fig. 13.3 Geostationary and Sun-synchronous Orbit
13.2 Types of satellites based on the applications
Providing communication and television services is only the tip of the
iceberg when it comes to the use of space-based technology. Many types
of satellites have been launched in recent years for a wide variety of
scientific purposes, including Earth observation, meteorological study,
navigation, studying the effects of space flight on living organisms, and
gaining insight into the cosmos. Today, the four most common types of
satellites, based on their application, are:
• Communication
• Earth observation
• Navigation
• Astronomical
Our in-depth examination of the characteristics of different types of satellites
and their functions continues below.
13.2.1 Communication Satellites
A communication spacecraft, usually located in GEO and equipped with a
transponder (an integrated receiver and transmitter of radio signals),
may receive signals from Earth and retransmit them back to the planet. As
a result, it opens interaction channels between regions that were previously
unable to communicate with one another due to large distances or other
obstacles. Different types of communication satellites facilitate various
forms of media transmission, such as radio, TV, telephone, and the
Internet.
Using this type of spacecraft, many signals can be relayed at once.
Spacecraft for broadcasting and TV signal distribution to ground-based
stations typically have individual transponders for each carrier; in most
cases, though, several carriers will be relayed by a single transponder.
Due to its compatibility with mobile terminals, this type of satellite is ideally
suited for long-distance communication.
13.2.2 Earth Observation Satellites
The purpose of Earth observation satellites is to monitor our planet
from space and report back on any changes they observe. This type of
space technology makes possible consistent and repeatable environmental
monitoring as well as rapid analysis of events during emergencies like
natural disasters and armed conflicts.
The goals of the surveillance mission determine the type of satellite sensors
used for Earth observation. Information collected varies depending on the
type of sensor employed and the available frequency bands.
EOS SAT-1, the first satellite of EOS Data Analytics' EOS SAT constellation,
is now orbiting the Earth on a mission to improve farming and forest
management through precision technology. The eleven spectral bands of
EOS SAT-1 are specifically designed to monitor diverse agriculture and
forestry aspects, from the presence of crop diseases to soil moisture.
13.2.3 Navigation Satellites
The navigation system constellations are located between 20,000 and
37,000 kilometers from Earth's surface. These satellites send out signals
that reveal their time, position in space, and health status. There are
two major types of space navigation systems:
• The spacecraft of the Global Navigation Satellite System
(GNSS) broadcast signals that GNSS receivers pick up and utilize
for geolocation purposes, providing global coverage. Galileo in
Europe, GPS in the United States, and the BeiDou Navigation
Satellite System in China are all examples of GNSS.
• The Regional Navigation Satellite System (RNSS) is an
autonomous regional navigation system that provides coverage on
a regional scale. For instance, India’s IRNSS project aims to provide
Indian citizens with a reliable location-based service.
13.2.4 Astronomical Satellites
Basically, an astronomical satellite is a giant telescope in orbit. It can see
well without interference from the Earth's atmosphere, and its infrared
imaging technology can function normally without being swamped by heat
from the planet's surface. A satellite used for astronomy can achieve a
vision up to ten times better than that of the most powerful telescope on
Earth.
Spacecraft used in astronomy can be broken down into several distinct
types:
• Astronomy satellites are used to investigate different types of
celestial bodies and phenomena in space, from the formation of stars
and the mapping of planetary surfaces to imaging the planets of our
solar system and studying black holes.
• The use of climate research satellites fitted with specific types of
sensors allows scientists to gather comprehensive, multi-faceted
data on the world’s oceans and ice, land, biosphere, and
atmosphere.
• Space-based studies of plant and animal cells and structures are
possible thanks to biosatellites. Because they allow scientists from
different regions to work together, this type of spacecraft plays a
crucial role in the progress of medicine and biology.
Most satellites can perform more than one function simultaneously. Still, it
is commonly recommended that researchers diversify the types of satellites
they use to obtain more comprehensive and accurate results. EOSDA Land
Viewer is a helpful tool for this, because it aggregates space-collected
imagery (including high-resolution imagery) from multiple sources and
provides a user-friendly interface for finding and downloading the images
you need.
Let us Sum Up
Satellites are carried from the surface to orbit by launch vehicles, high
enough to avoid orbital decay due to the atmosphere. Satellites can then
change or maintain their orbit by propulsion, usually with chemical or ion
thrusters. In 2018, about 90% of the satellites orbiting Earth were in low
Earth orbit or geostationary orbit; geostationary means the satellite appears
fixed at one point in the sky. Some imaging satellites choose a
Sun-synchronous orbit because it lets them scan the entire globe under
similar lighting. As the number of satellites and pieces of space debris
around Earth increases, the collision threat is becoming more severe.
Glossary
Geostationary orbit: A geostationary orbit, also referred to as a
geosynchronous equatorial orbit (GEO), is a circular geosynchronous orbit
35,786 km (22,236 mi) in altitude above Earth's Equator (42,164 km (26,199
mi) in radius from Earth's center) and following the direction of Earth's
rotation.
Sun-synchronous orbit: A Sun-synchronous orbit (SSO), also called a
heliosynchronous orbit, is a nearly polar orbit around a planet, in which the
satellite passes over any given point of the planet's surface at the same
local mean solar time. More technically, it is an orbit arranged so that it
precesses through one complete revolution each year, so it always
maintains the same relationship with the Sun.
Suggested Readings
1. Campbell, J., 1989. Introduction to Remote Sensing. Guilford, New York.
2. Lillesand, T.M. and Kiefer, R.W., 1994. Remote Sensing and Image Interpretation. John Wiley & Sons, New York.
3. Lueder, D.R., 1959. Aerial Photograph Interpretation: Principles and Applications. McGraw-Hill, New York.
Web Sources
1. https://eos.com/blog/types-of-satellites/
2. https://www.javatpoint.com/types-of-satellite-systems
Unit 14
Resolution: Spatial, Spectral, Radiometric
and Temporal
Structure
Overview
Learning Objectives
14.1 Resolution
14.1.1 Spatial Resolution
14.1.2 Spectral Resolution
14.1.3 Radiometric Resolution
14.1.4 Temporal Resolution
14.1.5 Geometric Resolution
14.2 Factors Affecting the Resolution of Remote Sensing Images
Let us Sum Up
Glossary
Suggested Readings
Web Sources
Overview
The resolution of satellite images varies depending on the instrument used
and the altitude of the satellite's orbit. For example, the Landsat archive
offers repeated imagery at 30-meter resolution for the planet, but most of it
has not been processed from the raw data. Landsat 7 has an average return
period of 16 days. For many smaller areas, images with resolution as fine
as 41 cm can be available. Satellite imagery is sometimes supplemented
with aerial photography, which has higher resolution but is more expensive
per square meter. Satellite imagery can be combined with vector or raster
data in a GIS, provided that the imagery has been spatially rectified so that
it aligns properly with other data sets.
Learning Objectives
After learning this lesson, you will be able to:
• Know about the resolution of satellite images.
• Know about the types of resolution.
14.1 Resolution
The term resolution is used in remote sensing to describe resolving
power, which includes the ability to identify not just the presence of two
objects, but also their characteristics. In qualitative terms, resolution is the
degree of detail shown in an image: an image with finer details is said to
have a finer resolution than one with coarser details. Satellites and other
airborne platforms are particularly useful for collecting information on a
regional scale, and different satellites provide data of different types and
quality. Every remote sensing system has several major resolutions
associated with it, and the scientist should understand them in order to
extract meaningful biophysical or hybrid information from the remotely
sensed imagery:
• Spectral
• Spatial
• Temporal
• Radiometric
• Geometric
14.1.1 Spatial Resolution
Spatial resolution is a measure of the smallest separation between two
objects that can be distinguished. For satellite images it is expressed
through the pixels: the spatial resolution of a specific image is the number
of meters of ground represented by each pixel. For example, the
multispectral scanner of the SPOT 4 satellite has a spatial resolution of
20 meters, which means that each individual square pixel represents a
400-square-meter area on the ground. When many images are combined
and the pixel sizes are averaged to represent a greater region, pixel size
and resolution are not necessarily the same.
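The pixel arithmetic in the SPOT 4 example can be written out in a few lines of Python (an illustrative sketch; the function names are ours, not standard terminology):

def pixel_ground_area_m2(resolution_m):
    # A square pixel of side resolution_m covers resolution_m ** 2 on the ground.
    return resolution_m ** 2

def pixels_to_cover(area_km2, resolution_m):
    return round(area_km2 * 1_000_000 / pixel_ground_area_m2(resolution_m))

print(pixel_ground_area_m2(20))      # 400 m^2 per SPOT 4 multispectral pixel
print(pixels_to_cover(1.0, 20))      # 2,500 pixels per square kilometre
print(pixels_to_cover(1.0, 0.41))    # ~5.9 million pixels at 41 cm resolution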
Fig. 14.1 Spatial Resolution
14.1.2 Spectral Resolution
The number and size of the bands in the electromagnetic spectrum that a
remote sensing platform can record is referred to as spectral resolution. For
example, the first two Landsat satellites carried the multispectral
scanner (MSS), which took images in four spectral bands (green, red,
and two near-infrared bands). Hyperspectral platforms can collect hundreds
of bands across the electromagnetic spectrum (e.g., Hyperion). Spectral
resolution thus refers to the wavelength intervals that the sensor can detect.
Sensors which can discriminate fine spectral differences are said to have a
high spectral resolution. In other words, spectral resolution is the number
and width of the specific wavelength intervals in the electromagnetic
spectrum to which a remote sensing instrument is sensitive, or the sensing
and recording power of the sensor in different bands of EMR. It is the ability
of the sensor to distinguish finer variations of the radiation reflected from
different objects.
14.1.3 Radiometric Resolution
Radiometric resolution refers to how much information is contained in a pixel
and is measured in bits. Each bit represents a power of 2, so a 1-bit image
has 2¹ = 2 brightness levels and an 8-bit image has 2⁸ = 256 levels
(values 0-255). The sensitivity of a remote sensing platform in detecting
slight variations in energy, specifically radiant flux (radiant energy emitted
per unit time), is known as radiometric resolution. A remote sensing platform
usually carries a passive or an active sensor. Passive sensors record
electromagnetic radiation reflected from the earth's surface. Active sensors
illuminate the earth's surface with machine-made electromagnetic radiation
and measure the amount of radiant flux that returns to the sensor.
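The bit-depth arithmetic above can be tabulated in a couple of lines (an illustrative sketch added here, not part of the original text):

for bits in (1, 6, 8, 11):
    levels = 2 ** bits    # an n-bit sensor records 2**n brightness levels
    print(f"{bits:>2}-bit -> {levels:>5} levels (values 0-{levels - 1})")
# 8-bit gives 256 levels (0-255), as in the text; many modern sensors
# quantise to 11 or 12 bits and distribute the data in 16-bit files.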
14.1.4 Temporal Resolution
Temporal resolution refers to how frequently a remote sensing platform revisits
and images the same area on the Earth's surface, i.e., the revisit or return
period of the satellite; Landsat 7, for example, has an average return period
of 16 days.
Fig. 14.4 Temporal Resolution
14.1.5. Geometric Resolution
Geometric resolution refers to the satellite sensor's ability to effectively
image a portion of the Earth's surface in a single pixel and is typically
expressed in terms of Ground sample distance, or GSD. GSD is a term
containing the overall optical and systemic noise sources and is useful for
comparing how well one sensor can "see" an object on the ground within a
single pixel. For example, the GSD of Landsat is ≈30m, which means the
smallest unit that maps to a single pixel within an image is ≈30m x 30m. The
latest commercial satellite (GeoEye 1) has a GSD of 0.41 m. This compares
to a 0.3 m resolution obtained by some early military film-
based Reconnaissance satellite.
14.2 Factors Affecting the Resolution of Remote Sensing Images
14.2.1. Platform and Sensor Characteristics:
Satellites and airplanes are commonly used platforms for
remote sensing, and their sensors have varying spatial, spectral, and
temporal resolutions. High-resolution sensors can capture detailed images
of small areas, while low-resolution sensors capture broader areas with less
detail. The choice of the platform and sensor depends on the specific
application and the level of detail required.
14.2.2. Environmental Conditions:
Environmental conditions such as cloud cover, atmospheric conditions, and
topography can affect the resolution of remote sensing images. For
example, clouds can obscure the surface features, affecting the visibility of
the ground.
Atmospheric conditions such as haze, dust, and smoke can also affect the
accuracy of remote sensing images by scattering or absorbing radiation.
Additionally, the terrain of the area being observed can also affect the
resolution, as it can cause shadows and distortions in the images.
14.2.3. Data Processing and Analysis:
The processing and analysis of remote sensing data can also affect the
resolution of the resulting images. The level of preprocessing and calibration
of the data can significantly impact the accuracy and detail of the images.
Additionally, image enhancement techniques such as filtering, contrast
stretching, and sharpening can improve the visual quality of the images but
may also introduce artifacts and reduce the spatial resolution.
14.2.4. User Requirements and Applications:
Finally, the resolution of remote sensing images can also be influenced by
the user’s requirements and the intended application. Different applications
may require different levels of resolution, depending on the objectives of the
analysis.
For example, mapping applications may require high-resolution images to
identify and map small features accurately. Conversely, broad-scale
analyses such as land cover mapping may require lower resolution images
to cover larger areas efficiently.
Let us Sum Up
Remote Sensing is the science and art of acquiring information (spatial,
spectral, radiometric, and temporal) about material objects, areas, or
phenomena, without coming into physical contact with the objects, areas,
or phenomena under investigation. Remotely sensed data are collected
using either passive or active remote sensing systems: passive sensors
record naturally reflected or emitted EMR, while active sensors generate
their own EMR and record the reflected energy. The orbital period is the
amount of time it takes a satellite to complete one rotation in its orbit
around the earth. The width of the area on the Earth's surface imaged by
the sensor during a single pass is referred to as the swath of a satellite.
Radiometric resolution refers to how much information is contained in a
pixel and is measured in bits; a 1-bit image has 2¹ = 2 levels and an 8-bit
image has 2⁸ = 256 levels (values 0-255).
Glossary
Hyperion: A hyperspectral imaging sensor carried on NASA's Earth
Observing-1 (EO-1) satellite, capable of recording more than 200 spectral
bands.
Suggested Readings
1. Campbell, J., 1989. Introduction to Remote Sensing. Guilford, New York.
2. Lillesand, T.M. and Kiefer, R.W., 1994. Remote Sensing and Image Interpretation. John Wiley & Sons, New York.
3. Lueder, D.R., 1959. Aerial Photograph Interpretation: Principles and Applications. McGraw-Hill, New York.
Web Sources
1. https://www.nrcan.gc.ca/maps-tools-and-publications/satellite-imagery-and-photos/tutorial-fundamentals-remote-sensing/satellites-and-sensors/satellite-characteristics-orbits-and-swaths/9283
2. https://www.spatialpost.com/types-of-resolution-in-remote-sensing/
Unit 15
Visual Image Interpretation
Structure
Overview
Learning Objectives
15.1 Visual Image Interpretation
15.2 Elements of Visual Image Interpretation
15.2.1 Tone
15.2.2 Size
15.2.3 Shape
15.2.4 Texture
15.2.5 Pattern
15.2.6 Shadow:
15.2.7 Site
15.2.8 Association
15.3 Interpretation Key
15.4 Field Verification
Let us Sum Up
Glossary
Check Your Progress
Suggested Readings
Web Sources
Overview
Visual image interpretation is a first analysis approach to remote sensing
imagery. Here, the size, shape, and position of objects as well as the
contrast and colour saturation are analysed. Analysis of remote sensing
imagery involves the identification of various targets in an image, and those
targets may be environmental or artificial features which consist of points,
lines, or areas. Targets may be defined in terms of the way they reflect or
emit radiation. This radiation is measured and recorded by a sensor, and
ultimately is depicted as an image product such as an air photo or a satellite
image. Observing the differences between targets and their backgrounds
involves comparing different targets based on any, or all, of the visual
elements of tone, shape, size, pattern, texture, shadow, and association.
Learning Objectives
After learning this lesson, you will be able to understand:
• Visual Image Interpretation
• Elements of Visual Image Interpretation
• Interpretation Keys
• Visual Interpretation using colour composite.
In some cases, a single element alone may allow successful identification;
in others, the use of several elements will be required.
Fig. 15.2. Satellite image of area in (a) grey scale and in (b) standard FCC.
15.2.2 Size
The size of objects can be important in the discrimination of objects and
features (single-family vs. multi-family residences, shrubs vs. trees, etc.).
When size is used as a diagnostic characteristic, both the relative and the
absolute sizes of objects can be important. Size can also be used in judging
the significance of objects and features (the size of trees is related to the
board feet which may be cut; the size of agricultural fields is related to water
use in arid areas, or the amount of fertilizer used; the size of runways gives
an indication of the types of aircraft that can be accommodated), as shown
in Fig. 15.3. It is important to assess the size of a target relative to other
objects in a scene, as well as its absolute size, to aid in the interpretation of
that target.
15.2.3 Shape
Shape refers to the general form, configuration, or outline of an individual
object. Shape is one of the most important single factors for recognising
objects in an image. Generally, regular shapes (squares, rectangles,
circles) are signs of man-made objects such as buildings, roads, and
cultivated fields, whereas irregular shapes with no distinct geometrical
pattern are signs of a natural environment, e.g., a river or a forest. A
common case of misinterpretation is between roads and rail lines: roads
can have sharp turns and join perpendicularly, but rail lines do not. From
the shape of the object in the following image, it can easily be said that the
dark blue coloured object is a river.
15.2.4 Texture
Texture refers to the frequency of tonal variation in particular areas of an
image, produced by an aggregation of features that are too small to be
discerned individually. Smooth textures (little tonal variation, such as calm
water or pavement) and rough textures (abrupt tonal variation, such as a
forest canopy) help distinguish objects that otherwise have similar tones.
Fig. 15.5: Textural variations
15.2.5 Pattern
Patterns are the spatial arrangement of objects, and they can be either
man-made or natural. Pattern is a macro image characteristic: the regular
arrangement of objects can be diagnostic of features on the landscape.
Arrangements of complex drainage in the form of ravines can be identified
easily. Likewise, the network or grid of streets in a subdivision or urban
area can aid identification and help in problem solving, such as tracing the
growth patterns of a city. Patterns can also be very important in geological
or geomorphological analysis: drainage patterns can tell the trained
observer a great deal about the lithology and structural patterns of an area
(Fig. 15.6). Dendritic drainage patterns develop on flat-bedded sediments,
radial patterns on or over domes, and linear or trellis patterns in areas with
faults or other structural controls.
15.2.6 Shadow:
It is useful in interpretation as it may provide an idea of the profile and
relative height of a target or targets, which may make identification easier.
However, shadows can also reduce or eliminate interpretation in their area
of influence, since targets within shadows are much less (or not at all)
discernible from their surroundings. Shadow is also useful for enhancing or
identifying topography and landforms.
15.2.7 Site
Site refers to the topographic or geographic location of an object, which
can aid identification; certain features are expected only at particular sites,
such as some tree species on well-drained uplands and others on
low-lying wet ground.
15.2.8 Association
Association takes into account the relationship between other recognisable
objects or features in proximity to the target of interest: some objects are so
commonly associated with one another that identification of one tends to
indicate or confirm the existence of another. Smokestacks, cooling ponds,
transformer yards, coal piles, and railroad tracks together suggest a
coal-fired power plant; arid terrain, a basin-bottom location, a highly
reflective surface, sparse vegetation, and a water body surrounded by salt
ponds and saline patches suggest a playa with salt production units.
Association is one of the most helpful clues in identifying man-made
installations. Aluminium manufacture requires large amounts of electrical
energy, so the absence of a power supply may rule out this industry.
Cement plants have rotary kilns. Schools at different levels typically have
characteristic playing fields, parking lots, and clusters of buildings in urban
areas.
15.3 Interpretation Key
An interpretation key is a set of reference materials designed to help
interpreters identify objects rapidly and consistently. Keys may be largely
verbal descriptions of diagnostic features, or they can consist largely of
illustrations, e.g., of landforms, industrial facilities, or military installations.
Many types of keys are already available if you can find or get your hands
on them; this can often be very difficult, which is one reason why people
develop their own keys. Depending upon the way the diagnostic features
are organised, two types of keys are generally recognised: 1) selective
keys and 2) elimination keys. Selective keys are arranged in such a way
that the interpreter simply selects the example that most closely
corresponds to the object they are trying to identify, e.g., industries,
landforms, etc. Elimination keys are arranged so that the interpreter follows
a precise stepwise process that leads to the elimination of all items except
the one(s) they are trying to identify. Dichotomous keys are essentially a
class of elimination key. Most interpreters prefer to use elimination keys in
their analyses (Colwell, 1997; Olson, 1960).
15.4 Field Verification
The results of visual interpretation should be checked in the field to confirm
doubtful identifications and to assess the accuracy of the interpreted map;
such ground verification makes the information generated from the
interpretation process more authentic and reliable.
Let us Sum Up
The information generated from the interpretation process becomes more
authentic and reliable when verified in the field. The relative brightness or
colour of features in an image is referred to as tone. The spatial
arrangement of objects is known as pattern; a recognisable pattern is
formed by the orderly repetition of similar tones and textures. Shadow
makes targets easier to identify by providing an idea of their profile and
relative height.
Glossary
Association: Association refers to the occurrence of certain features in
relation to other objects in the imagery. In an urban area, a smooth
vegetation pattern generally indicates a playground or grassland rather
than agricultural land.
Interpretation Key: The criterion for identifying an object using the
interpretation elements is called an interpretation key.
Suggested Readings
1. Campbell, J., 1989. Introduction to Remote Sensing. Guilford, New York.
2. Lillesand, T.M. and Kiefer, R.W., 1994. Remote Sensing and Image Interpretation. John Wiley & Sons, New York.
Web Sources
1. http://ecoursesonline.iasri.res.in/
2. https://natural-resources.canada.ca/
3. https://www.dspmuranchi.ac.in/pdf/Blog/Fundamental%20of%20Visual%20Image%20Interpretation%20&%20Its%20Keys.pdf
Unit 16
Digital Image Classification
Structure
Overview
Learning Objectives
16.1 Digital Image Classification
16.2 Supervised Classification
16.2.1 Steps involved in Supervised Classification
16.2.2 Advantages and Disadvantages of Supervised Classification
16.3 Unsupervised Classification
16.3.1 K-means Clustering
16.3.2 ISODATA Clustering
Let us Sum Up
Glossary
Suggested Readings
Web Sources
Overview
Image classification is the process of categorizing and labelling groups of
pixels or vectors within an image into one of several predefined classes.
Image classification is a major part of digital image analysis. Classification
between objects is a difficult task and therefore image classification has
been an important task within the field of computer vision. The major steps
for image classification include training, classifying and then recognising the
pattern of the image. To classify the image, we must first understand the
relationship between the data and the classes. There are three commonly
used methods of image classification: supervised, unsupervised and
object-based image analysis.
Unsupervised classification is essentially the inverse of supervised
classification. The analyst initially groups spectral classes based on the
numerical information in the data, and then matches them to information
classes. Clustering algorithms are programs used to determine the natural
(statistical) groupings or structures in the data. Typically, the analyst sets
the number of groups or clusters to look for in the data. In addition to the
number of classes, the analyst may also define parameters relating to the
separation distance between clusters. As a result, unsupervised
classification is not completely without human assistance; however, unlike
supervised classification, it does not begin with a pre-determined set of
classes.
Learning Objectives
After learning this lesson, you will be able to understand:
• Supervised classification
• The basic steps to apply supervised classification
• Unsupervised classification
• Radiometric differentiation is the detection of differences in brightness,
which may in certain cases be used to inform the image analyst as to
the nature or condition of the remotely sensed object.
• Spatial differentiation is related to the concept of spatial resolution. We
may be able to analyze the spectral content of a particular pixel or group
of pixels in a digital image when those pixels comprise a single
homogeneous material or object. It is also important to understand the
potential for mixing of the spectral signatures of multiple objects into the
recorded spectral values for a single pixel. When designing an image
analysis task, it is important to consider the size of the objects to be
discovered or studied compared to the ground sample distance of the
sensor.
16.2 Supervised Classification
Supervised classification is based on the idea that a user can select sample
pixels in an image that are representative of specific classes and then direct
the image processing software to use these training sites as references for
the classification of all other pixels in the image. Training sites are selected
based on the knowledge of the user. The user also sets the bounds for how
similar other pixels must be to group them together. These bounds are often
set based on the spectral characteristics of the training area, plus or minus
a certain increment. The user also designates the number of classes that
the image is classified into. Many analysts use a combination of supervised
and unsupervised classification processes to develop final output analysis
and classified maps.
16.2.1 Steps involved in Supervised Classification
A supervised classification algorithm requires a training sample for each
class, that is, a collection of data points known to have come from the class
of interest. The classification is thus based on how “close” a point to be
classified is to each training sample. We shall not attempt to define the word
“close” other than to say that both Geometric and statistical distance
measures are used in practical pattern recognition algorithms. The training
samples are representative of the known classes of interest to the analyst.
Classification methods that rely on the use of training patterns are called
supervised classification methods. In supervised classification, you select
representative samples for each land cover class. The software then uses
these “training sites” and applies them to the entire image. The three basic
steps involved in a typical supervised classification procedure are as
follows:
(i) Training stage: The analyst identifies representative training areas
and develops numerical descriptions of the spectral signatures of
each land cover type of interest in the scene. Training sites are areas
that are known to be representative of a particular land cover type.
The computer determines the spectral signature of the pixels within
each training area, and uses this information to define the statistics,
including the mean and variance of each of the classes. Preferably
the location of the training sites should be based on field collected
data or high-resolution reference imagery. It is important to choose
training sites that cover the full range of variability within each class to
allow the software to accurately classify the rest of the image. If the
training areas are not representative of the range of variability found
within a particular land cover type, the classification may be much less
accurate. Multiple, small training sites should be selected for each
class. The more time and effort spent in collecting and selecting
training sites, the better the classification results.
(ii) The classification stage (decision rule) or generate signature file:
Each pixel in the image data set is categorised into the land cover
class it most closely resembles. If the pixel is insufficiently similar to
any training data set, it is usually labelled 'unknown'.
(iii) The output stage or classify: The results may be used in several
different ways. Three typical forms of output product are thematic
maps, tables and digital data files, which become input data for GIS.
The output of image classification thus becomes input for GIS for
spatial analysis of the terrain. Fig. 2 depicts the flow of operations to
be performed during image classification of remotely sensed data of
an area, which ultimately leads to the creation of a database as an
input for GIS. Plate 6 shows the land use/land cover colour-coded
image, which is an output of image classification.
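The three stages map naturally onto a few lines of code. The sketch below uses Python with scikit-learn; the package choice, the random-forest decision rule, and the synthetic training data are all assumptions made for illustration, since the text above does not prescribe a particular algorithm or software:

import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# (i) Training stage: spectral samples (4 bands) for two hypothetical classes,
# standing in for pixels digitised from user-selected training sites.
water = rng.normal(loc=[0.05, 0.04, 0.03, 0.02], scale=0.01, size=(50, 4))
forest = rng.normal(loc=[0.04, 0.08, 0.05, 0.40], scale=0.02, size=(50, 4))
train_pixels = np.vstack([water, forest])
train_labels = np.array([0] * 50 + [1] * 50)   # 0 = water, 1 = forest

# (ii) Classification stage: fit a decision rule to the training signatures.
clf = RandomForestClassifier(n_estimators=50, random_state=0)
clf.fit(train_pixels, train_labels)

# (iii) Output stage: label every pixel of a (rows, cols, bands) image.
image = rng.random((100, 100, 4))              # stand-in for a real scene
classified = clf.predict(image.reshape(-1, 4)).reshape(100, 100)
print(classified.shape)                        # (100, 100) thematic map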
16.2.2 Advantages and Disadvantages of Supervised
Classification:
In supervised classification most of the effort is done prior to the actual
classification process. Once the classification is run the output is a thematic
image with classes that are labeled and correspond to information classes
or land cover types. Supervised classification can be much more accurate
than unsupervised classification, but it depends heavily on the training
sites, the skill of the individual processing the image, and the spectral
distinctness of the classes. If two or more classes are very similar in terms
of their spectral reflectance (e.g., annual-dominated grasslands vs.
perennial grasslands), misclassification will tend to be high. Supervised
classification requires close attention to the development of training data:
if the training data are poor or unrepresentative, the classification results
will also be poor. Therefore, supervised classification generally requires
more time and money than unsupervised classification.
16.3 Unsupervised Classification
16.3.1 K-means Clustering
K-means clustering groups pixels into clusters on the basis of their spectral
similarity to the cluster means or another principle. The k-means clustering
algorithm thus helps split a given unknown dataset into a fixed number (k)
of user-defined clusters. The
objective of the algorithm is to minimise variability within the cluster.
The data point at the center of a cluster is known as a centroid. In most of
the image processing software, each centroid is an existing data point in the
given input data set, picked at random, such that all centroids are unique.
Initially, a randomised set of clusters is produced. Each centroid is thereafter
set to the arithmetic mean of the cluster it defines. The process of classification
and centroid adjustment is repeated until the values of the centroids stabilise.
The final centroids are used to produce final classification or clustering of
input data, effectively turning a set of initially anonymous data points into a
set of data points, each with a class identity.
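A minimal sketch of this procedure applied to image pixels, using scikit-learn's KMeans (an assumed implementation; any k-means routine would do, and the random array stands in for real band data):

import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
image = rng.random((100, 100, 4))         # stand-in (rows, cols, bands) image

pixels = image.reshape(-1, 4)             # one row per pixel, one column per band
kmeans = KMeans(n_clusters=5, n_init=10, random_state=0).fit(pixels)

cluster_map = kmeans.labels_.reshape(100, 100)  # unlabelled spectral classes
centroids = kmeans.cluster_centers_             # mean signature of each cluster
print(cluster_map.shape, centroids.shape)       # (100, 100) (5, 4)

Note that rerunning without a fixed random_state can give different clusters, which is exactly the sensitivity to initial assignments described in the disadvantages below.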
Advantage
• The main advantage of this algorithm is its simplicity and speed, which allows it to run on large datasets.
Disadvantages
• It does not yield the same result with each run, since the resulting clusters depend on the initial random assignments.
• It is sensitive to outliers; for such datasets, k-medians clustering is used instead.
• One must specify the number of clusters as an input to the algorithm.
16.3.2 ISODATA Clustering
ISODATA (Iterative Self-Organising Data Analysis Technique) clustering
method is an extension of k-means clustering method (ERDAS, 1999). It
represents an iterative classification algorithm and is useful when one is not
sure of the number of clusters present in an image. It is iterative because it
makes a large number of passes through the remote sensing dataset until
specified results are obtained. Good results are obtained if all bands in
remote sensing image have similar data ranges. It includes automated
merging of similar clusters and splitting of heterogeneous clusters. The
clustering method requires as input the maximum number of clusters
wanted, a convergence threshold and the maximum number of iterations
to be performed. ISODATA clustering takes place in the following steps:
• k arbitrary cluster means are established;
• all pixels are relocated into the closest clusters by computing the distance between each pixel and each cluster;
• the centroids of all clusters are recalculated, and the above step is repeated until the convergence threshold is reached; and
• clustering is considered complete only if the number of clusters is within the specified number and the distances between the clusters meet a prescribed threshold.
Advantages
• It is good at finding the "true" clusters within the data.
• It is not biased to the top pixels in the image.
Let us Sum Up
Supervised image classification is a type of classification in which the user
or image analyst 'supervises' the pixel classification process, assigning the
various pixel values or spectral signatures to the classes. Maximum
likelihood is a statistical approach to pattern recognition in which the
probability of a pixel belonging to each of a predefined set of classes is
calculated. Support vector machines are supervised non-parametric
statistical learning techniques that do not require any assumptions about
the data distribution. ISODATA clustering involves combining clusters with
similar spectral signatures and splitting clusters with high variability.
Glossary
Spectral signature: The variation of a material's reflectance or emittance
with respect to wavelengths is known as spectral signature.
Spectral classes are clusters of pixels in the data that are uniform (or nearly
uniform) in terms of their brightness values in the various spectral channels.
Suggested Readings
1. Campbell, J., 1989. Introduction to Remote Sensing. Guilford, New York.
2. Lillesand, T.M. and Kiefer, R.W., 1994. Remote Sensing and Image Interpretation. John Wiley & Sons, New York.
3. Lueder, D.R., 1959. Aerial Photograph Interpretation: Principles and Applications. McGraw-Hill, New York.
Web Sources
1. https://gisgeography.com/image-classification-techniques-remote-sensing/
2. https://www.ukessays.com/essays/engineering/supervised-image-classification-9746.php
3. https://core.ac.uk/download/pdf/234663192.pdf
4. https://sites.google.com/site/dataclusteringalgorithms/k-means-clustering-algorithm
5. https://people.utm.my/nurulhazrina/files/2015/05/L12-Unsupervised-classification.pdf
About Tamil Nadu Open University
Tamil Nadu Open University (TNOU), with its headquarters at Chennai,
was established in 2000 by an Act of the Tamil Nadu Legislature for the
introduction and promotion of open and distance education at the State
level, and for the co-ordination and determination of standards in such
systems. The salient features of TNOU are relaxed entry rules,
maintenance of standards, individualised study, flexibility in terms of place
and duration of study, use of the latest information and communication
technology, a well-knit student support services network, cost-effective
programmes, and collaboration and resource sharing with other
universities.
School of Sciences
The School of Sciences, established in 2004, has been offering B.Sc. and M.Sc. programmes
in Mathematics since 2005 and B.Sc. Mathematics with Computer Application since 2007. In 2017,
B.Sc. programmes in Physics, Chemistry, Botany, and Zoology were introduced, while M.Sc.
programmes in Physics, Chemistry, Botany, and Zoology were launched in 2018. As part of an
academic restructuring, the Departments of Geography and Apparel & Fashion Design were
merged into the School of Sciences in 2020, and these departments offer B.Sc. and M.Sc.
programmes.
The main objective is to excite the brains and hearts of rural students through constant
inquiry and active participation in Science. The School has blazed a trail of information
transmission and generation, graduating over 25,000 Science students across the nation. It has
built a niche for itself in the core areas of teaching, research, consultation, administration, and
community services over the last 17 years.