BGEOS-52

B.Sc., Geography
Semester - V

FUNDAMENTALS OF REMOTE
SENSING
Department of Geography
School of Sciences
Tamil Nadu Open University
Chennai – 600 015.
BACHELOR OF SCIENCE IN GEOGRAPHY

FUNDAMENTALS OF REMOTE SENSING

BGEOS-52

SEMESTER-V

Department of Geography,
School of Science
Tamil Nadu Open University
577, Anna Salai, Saidapet, Chennai - 600 015
www.tnou.ac.in

August 2023
Name of Programme: B.Sc. Geography

Course Code: BGEOS-52

Course Title: Fundamentals of Remote Sensing

Course Design: Dr. K. Katturajan


Assistant Professor,
Department of Geography,
School of Science,
Tamil Nadu Open University, Chennai -600 015

Course Writers: Dr. Yasodaran Suresh


Assistant Professor,
Department of Geography,
Madras Christian College, Tambaram,
Chennai- 600059.

Course Coordinator & Editor: Dr. K. Katturajan

August 2023 (First Edition)

ISBN No: 978-93-5753-403-1

© Tamil Nadu Open University, 2023


All rights reserved. No part of this work may be reproduced in any form, by mimeograph or any other
means, without permission in writing from the Tamil Nadu Open University. The Course Writer is solely
responsible for the contents presented in the Course Materials. Further information on the Tamil Nadu
Open University Academic Programmes may be obtained from the University Office at 577, Anna Salai,
Saidapet, Chennai-600 015 [or] www.tnou.ac.in

© TNOU, 2023. “Fundamentals of Remote Sensing” is made available under a


Creative Commons Attribution-Share Alike 4.0 License (International)
https://creativecommons.org/licenses/by-sa/4.0/

Printed by:
BGEOS 52 - FUNDAMENTALS OF REMOTE SENSING
Syllabus Details

Block 1: Remote Sensing


1. Definition and Types: Aerial, Satellite and Radar
2. History, Organization and Development of Space Programmes
Block 2: Remote Sensing Processes
3. Introduction to Remote Sensing
4. Sources of Energy and Electromagnetic Radiations (EMR)
5. Electromagnetic Spectrum and Atmospheric Windows
6. Energy Interaction with Atmosphere and Earth
Block 3: Types of Remote Sensing and Scanners
7. Platforms, Types of Platforms and their Characteristics
8. Active and Passive, Optical-Mechanical Scanners and Push-Broom Scanners
9. Thermal Remote Sensing and Ideal Remote Sensing Systems
Block 4: Fundamentals of Aerial Remote Sensing
10. Aerial Photo Imaging System and Types of Aerial Photographs
11. Marginal Information of Aerial Photographs
12. Elements of Photo Interpretation
Block 5: Fundamentals of Satellite Remote Sensing
13. Types of Satellites: Geostationary and Sun-synchronous Satellites
14. Resolution: Spatial, Spectral, Radiometric and Temporal
15. Visual Image Interpretation
16. Digital Image Classification
References:
1. Sarkar, A. (2015): Practical Geography: A Systematic Approach. Orient Black Swan
Private Ltd., New Delhi.
2. Anji Reddy, M. (2008): Textbook of Remote Sensing and Geographic Information
System, B.S. Publication, Hyderabad.
3. Campbell, J. B., (2007): Introduction to Remote Sensing, Guilford Press.
4. Chauniyal, D.D., (2010): Sudur Samvedan evam Bhogolik Suchana Pranali (Hindi),
Sharda Pustak Bhawan, Allahabad.
5. Jensen, J.R. (2007): Remote Sensing of the Environment: An Earth Resource
Perspective, Prentice-Hall Inc., New Jersey.
6. Joseph, G. (2005): Fundamentals of Remote Sensing, Universities Press (India), Hyderabad.
7. Kumar, Dilip, Singh, R.B. and Kaur, Ranjeet (2019): Spatial Information Technology
for Sustainable Development Goals, Springer.
8. Lillesand, T.M., and Kiefer, R.W., (2007): Remote Sensing and Image Interpretation,
6th Edition, John Wiley & Sons, New York.
Web Sources

1. https://gistbok.ucgis.org/bok-topics
2. https://gisgeography.com/remote-sensing-earth-observation-guide/
3. https://www.iirs.gov.in/
4. https://www.heavy.ai/technical-glossary/remote-sensing
BGEOS-52 - FUNDAMENTALS OF REMOTE SENSING

Unit No.  Contents  Page No.

Block 1: Remote Sensing

1 Definition and Types: Aerial, Satellite and Radar 2

2 History, Organization and Development of Space Programmes 11

Block 2: Remote Sensing Processes

3 Introduction to Remote Sensing 28

4 Sources of Energy and Electromagnetic Radiations (EMR) 34

5 Electromagnetic Spectrum and Atmospheric Windows 42

6 Energy Interaction with Atmosphere and Earth 51

Block 3: Types of Remote Sensing and Scanners

7 Platforms, Types of Platforms, and their Characteristics 61


8 Active and Passive, Optical-Mechanical Scanners and Push-Broom Scanners 74

9 Thermal Remote Sensing and Ideal Remote Sensing Systems 80

Block 4: Fundamentals of Aerial Remote Sensing

10 Aerial Photo Imaging System and Types of Aerial Photographs 91

11 Marginal Information of Aerial Photographs 101

12 Elements of Photo Interpretation 108

Block 5: Fundamentals of Satellite Remote Sensing

13 Types of Satellites: Geostationary and Sun-synchronous Satellites 116

14 Resolution: Spatial, Spectral, Radiometric and Temporal 128

15 Visual Image Interpretation 136

16 Digital Image Classification 146


Block 1
Remote Sensing

Unit 1: Definition and Types: Aerial, Satellite and Radar

Unit 2: History, Organization and Development of Space Programmes
Unit 1
Definition and Types: Aerial, Satellite and
Radar
Structure
Overview
Learning Objectives

1.1 Aerial Photography


1.1.1 Types of Aerial Photographs
1.2 Satellites

1.2.1 Types of Satellites


1.3 Radar
1.3.1 Types of Radar
Let Us Sum Up
Glossary
Check Your Progress
Suggested Readings

Overview
An aerial photograph, in broad terms, is any photograph taken from the air.
Normally, air photos are taken vertically from an aircraft using a highly
accurate camera. There are several things you can look for to determine
what makes one photograph different from another of the same area,
including type of film, scale, and overlap. Other important concepts used in
aerial photography are stereoscopic coverage, fiducial marks, focal length,
roll and frame numbers, and flight lines and index maps. Earth observation
satellites gather information for reconnaissance, mapping, monitoring the
weather, ocean, forest, etc. Space telescopes take advantage of outer
space's near-perfect vacuum to observe objects across the entire
electromagnetic spectrum. Because satellites can see a large portion of the
Earth at once, communications satellites can relay information to remote
places.

The signal delay from satellites and their orbit's predictability are used in
satellite navigation systems, such as GPS. Space probes are satellites
designed for robotic space exploration outside of Earth, and space stations

are in essence crewed satellites. Radar was developed secretly for military
use by several countries in the period before and during World War II. A key
development was the cavity magnetron in the United Kingdom, which
allowed the creation of relatively small systems with sub-meter resolution.
The term RADAR was coined in 1940 by the United States Navy as an
acronym for "radio detection and ranging". The modern uses of radar are
highly diverse, including air and terrestrial traffic control, radar astronomy,
air-defense systems, anti-missile systems, marine radars to locate
landmarks and other ships, aircraft anti-collision systems, ocean
surveillance systems, outer space surveillance and rendezvous systems,
meteorological precipitation monitoring, altimetry and flight control systems,
guided missile target locating systems, self-driving cars, and ground-
penetrating radar for geological observations.
Learning Objectives
After studying this unit, you will be able to understand the following.
• Aerial and its types
• Satellite and its types
• Radar and its types

1.1 Aerial Photography


This is the most common form of remote sensing data. The first commercial
companies to carry out aerial surveys came into existence in the early 1920s.
Many experiments were then made to improve surveying by using different
image formats, cameras, lenses, and films.
• A multispectral camera is mounted in the aircraft in conjunction with a
survey camera.
• Generally monochrome films are used because they are simple to
process and do not need extremely rigorous laboratory conditions.
• This film is available in several types and at speeds which allow
adequate photography to be taken over a wide range of weather and
lighting conditions.
• The exposed film must be sent to a fully equipped laboratory; the
photographic cover cannot be checked within hours of being taken.
The aerial photographs are classified based on the position of the camera
axis, scale, angular extent of coverage and the film used. The types of aerial
photographs based on the position of the optical axis and the scale are given
below.

1.1.1 Types of Aerial Photographs based on the position of the
Camera Axis
Based on the position of the camera axis, aerial photographs are classified
into the following types.
a. Vertical Photographs
While taking aerial photographs, two distinct axes are formed from the
camera lens centre, one towards the ground plane and the other towards
the photo plane. The perpendicular dropped from the camera lens centre to
the ground plane is termed as the vertical axis, whereas the plumb line
drawn from the lens centre to the photo plane is known as the
photographic/optical axis. When the photo plane is kept parallel to the
ground plane, the two axes also coincide with each other. The photograph
so obtained is known as a vertical photograph. However, it is normally very
difficult to achieve perfect parallelism between the two planes because the
aircraft flies over the curved surface of the earth. The photographic axis,
therefore, deviates from the vertical axis. If such a deviation is within the
range of plus or minus 3°, the near vertical aerial photographs are obtained.
Any photograph with an unintentional deviation of more than 3° in the
optical axis from the vertical axis is known as a tilted photograph.

Fig. 1.1 Vertical Aerial photograph

b. Low Oblique
An aerial photograph taken with an intentional deviation of 15° to 30° in the
camera axis from the vertical axis is referred to as the low oblique
photograph. This kind of photograph is often used in reconnaissance
surveys.

Fig. 1.2 Low Oblique photograph


c. High Oblique
High oblique photographs are obtained when the camera axis is
intentionally inclined about 60° from the vertical axis. Such photography is
useful in reconnaissance surveys.

Fig. 1.3 High Oblique Photograph
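The tilt-angle thresholds described above lend themselves to a compact worked example. Below is a minimal illustrative sketch; the function name and the handling of intermediate angles are our assumptions, not part of the text.

```python
def classify_aerial_photo(tilt_deg: float, intentional: bool = True) -> str:
    """Classify an aerial photograph by the deviation (in degrees) of the
    camera's optical axis from the vertical axis, using the thresholds
    given in this unit."""
    if tilt_deg <= 3:
        # Within plus or minus 3 degrees the photo counts as (near) vertical.
        return "vertical / near-vertical"
    if not intentional:
        # An unintentional deviation of more than 3 degrees gives a tilted photograph.
        return "tilted"
    if 15 <= tilt_deg <= 30:
        return "low oblique"
    if tilt_deg >= 60:
        return "high oblique"
    # The text does not name intentional tilts between 30 and 60 degrees.
    return "oblique (intermediate tilt)"

print(classify_aerial_photo(2))                      # vertical / near-vertical
print(classify_aerial_photo(5, intentional=False))   # tilted
print(classify_aerial_photo(20))                     # low oblique
print(classify_aerial_photo(60))                     # high oblique
```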

1.2 Satellites
A satellite or artificial satellite is an object intentionally placed into orbit in
outer space. Except for passive satellites, most satellites have an electricity
generation system for equipment on board, such as solar panels or
radioisotope thermoelectric generators (RTGs). Most satellites also have a
method of communication to ground stations, called transponders. Many
satellites use a standardized bus to save cost and work, the most popular
of which is the small CubeSat. Similar satellites can work together as a group,
forming constellations. Because of the high launch cost to space, satellites
are designed to be as lightweight and robust as possible.

Satellites are placed into orbit from the surface by launch vehicles, high
enough to avoid orbital decay by the atmosphere. Satellites can then
change or maintain the orbit by propulsion, usually by chemical or ion
thrusters. As of 2018, about 90% of satellites orbiting Earth were in low Earth
orbit or geostationary orbit; a geostationary satellite appears to stay still in
the sky. Some imaging satellites choose a Sun-synchronous orbit because
they can scan the entire globe with similar lighting. As the number of
satellites and space debris around Earth increases, the collision threat is
becoming more severe.
1.2.1 Types of Satellites
Four different types of satellite orbits have been identified depending on the
shape and altitude of each orbit:
• GEO (Geostationary Earth Orbit) at about 36,000 km above the earth's surface.
• LEO (Low Earth Orbit) at 500-1,500 km above the earth's surface.
• MEO (Medium Earth Orbit) or ICO (Intermediate Circular Orbit) at 6,000-20,000 km above the earth's surface.
• HEO (Highly Elliptical Orbit).
Most GEO satellites are geostationary, with zero inclination above the
equator, meaning they remain over the same point all the time. This makes
them very useful as communication satellites; hence this is the second most
common orbit, with over 20% of spacecraft (554 satellites) in this orbit.
Typically, LEO satellites have an orbit time of around 90 to 120 minutes.
Almost all human activity in space is in LEO, including the ISS. The ability to
get close to Earth for reconnaissance, the high speed of orbit and the fast
transfer of data make this the most popular region, with over 72% of
satellites in this space.

MEO refers to satellites between the LEO and GEO orbits; yet, despite the
huge range of distances, only 5% of satellites operate in this space. This area
is used largely by navigation satellites, like the European Galileo system.
HEO is the least common of the satellite orbits, with just 2% of satellites in
this orbit. The vast majority of these are military or government missions,
with just three that are for commercial use.

1.3 Radar
Radar is a detection system that uses radio waves to determine the distance
(ranging), angle, and radial velocity of objects relative to the site. It can be
used to detect aircraft, ships, spacecraft, guided missiles, motor vehicles,
weather formations, and terrain. A radar system consists of a transmitter
producing electromagnetic waves in the radio or microwaves domain, a
transmitting antenna, a receiving antenna (often the same antenna is used
for transmitting and receiving) and a receiver and processor to determine
properties of the objects. Radio waves (pulsed or continuous) from the
transmitter reflect off the objects and return to the receiver, giving
information about the objects' locations and speeds.
1.3.1 Types of Radar
a) Bistatic Radar
This type of radar system includes a transmitter (Tx) and a receiver (Rx)
separated by a distance comparable to the distance of the target object. A
system in which the transmitter and receiver are situated at the same
position is called a monostatic radar, whereas very long-range surface-to-air
and air-to-air military systems use bistatic radar.
b) Doppler Radar
It is a special type of radar that uses the Doppler effect to produce velocity
data about a target at a particular distance. This is achieved by
transmitting electromagnetic signals in the direction of an object and
analysing how the object's motion has affected the returned signal's
frequency. The shift gives very precise measurements of the radial
component of an object's velocity relative to the radar. The applications
of these radars span different industries such as meteorology,
aviation, healthcare, etc.
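The velocity measurement described here follows from the two-way Doppler relation Δf = 2vf/c, which can be inverted in a few lines. A hedged sketch; the function name and the S-band example frequency are illustrative assumptions:

```python
C = 299_792_458.0  # speed of light, m/s

def radial_velocity_ms(doppler_shift_hz: float, carrier_hz: float) -> float:
    """Invert the two-way Doppler relation: v = c * df / (2 * f).
    A positive shift means the target is approaching the radar."""
    return C * doppler_shift_hz / (2.0 * carrier_hz)

# e.g. a 500 Hz shift on a 2.8 GHz (S-band) weather-radar carrier
# corresponds to a radial speed of roughly 27 m/s.
print(round(radial_velocity_ms(500.0, 2.8e9), 1))
```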
c) Monopulse Radar

This kind of radar system compares the signal received from a single
radar pulse against itself by contrasting the signal as observed in several
directions or polarizations, allowing the position of the object to be
measured directly from one pulse. A well-known related technique is the
conical scanning radar, which compares the return from two directions to
measure the position of the object. It is significant to note that most radars
developed since the 1960s are monopulse radars.
d) Passive Radar
This kind of radar is designed to detect and track targets by
processing reflections of ambient illumination in the surroundings.
Such sources include communication signals and commercial
broadcasts. Passive radar can be regarded as falling in the same
category as bistatic radar.
e) Instrumentation Radar
These radars are designed for testing aircraft, missiles, rockets, etc. They
provide information including space, position, and time, both in real time
and in post-processing analysis.
f) Weather Radars
These are used to detect weather and wind direction by using radio signals
with circular or horizontal polarization. The frequency choice of a weather
radar mainly depends on a performance compromise between attenuation
and precipitation reflection, both consequences of atmospheric water
vapour. Some types of radar are designed to employ Doppler shifts
to calculate wind speed, and dual polarization to recognize the
types of precipitation.
g) Mapping Radar
These radars are mainly used to examine a large geographical area for
remote sensing and geography applications. Based on synthetic
aperture radar, they are restricted to relatively stationary targets. Some
radar systems can also detect humans behind walls, since the radar
reflections of humans are more varied than those of construction materials.
h) Navigational Radars
Generally, these are similar to search radars, but they use shorter
wavelengths capable of reflecting from the ground and from obstacles such
as stones. They are commonly used on commercial ships and long-
distance aircraft. Among navigational radars, marine radars are placed
on ships for collision avoidance and navigational purposes.

i) Pulsed Radar
Pulsed RADAR sends high power and high-frequency pulses towards the
target object. It then waits for the echo signal from the object before another
pulse is sent. The range and resolution of the RADAR depend on the pulse
repetition frequency. It uses the Doppler shift method. The principle of
RADAR detecting moving objects using the Doppler shift rests on the fact
that echo signals from stationary objects remain in the same phase and hence
get cancelled, while echo signals from moving objects return with some
change in phase.
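The pulse-timing principle in this section reduces to R = ct/2 for the measured echo delay, while the pulse repetition frequency (PRF) caps the farthest unambiguous range at c/(2·PRF). A small illustrative sketch; the function names are assumptions:

```python
C = 299_792_458.0  # speed of light, m/s

def range_from_echo_km(delay_s: float) -> float:
    """Target range from the round-trip echo delay: R = c * t / 2."""
    return C * delay_s / 2.0 / 1000.0

def max_unambiguous_range_km(prf_hz: float) -> float:
    """The farthest echo must return before the next pulse is sent,
    so R_max = c / (2 * PRF)."""
    return C / (2.0 * prf_hz) / 1000.0

# A 1 ms echo delay places the target about 150 km away, and a radar
# pulsing at 1 kHz can unambiguously measure ranges out to about 150 km.
print(round(range_from_echo_km(1e-3), 1))
print(round(max_unambiguous_range_km(1000.0), 1))
```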

Let Us Sum Up
An aerial photograph, in broad terms, is any photograph taken from the air.
Normally, air photos are taken vertically from an aircraft using a highly
accurate camera. There are several things you can look for to determine
what makes one photograph different from another of the same area,
including type of film, scale, and overlap. A satellite or artificial satellite is an
object intentionally placed into orbit in outer space. Except for passive
satellites, most satellites have an electricity generation system for
equipment on board, such as solar panels or radioisotope thermoelectric
generators (RTGs).

Glossary
Vertical Photograph: A photograph obtained when the photo plane is kept
parallel to the ground plane, so that the optical axis of the camera coincides
(within about plus or minus 3°) with the vertical axis.
Low Oblique: An aerial photograph taken with an intentional deviation of 15°
to 30° in the camera axis from the vertical axis is referred to as the low
oblique photograph. This kind of photograph is often used in
reconnaissance surveys.
High Oblique: Photographs obtained when the camera axis is intentionally
inclined about 60° from the vertical axis. Such photography is useful in
reconnaissance surveys.

Check Your Progress


1. What are the types of Aerial Photographs?
• Vertical Photograph

• Low Oblique
• High Oblique

2. Types of Satellites
• Geostationary Earth Orbit
• Low Earth Orbit

• Medium Earth Orbit


• Highly Elliptical Orbit
3. Types of Radar

• Bistatic Radar
• Doppler Radar
• Monopulse Radar

• Passive Radar
• Instrumentation Radar
• Weather Radars
• Mapping Radar
• Navigational Radars
• Pulsed Radar

Suggested Readings
1. Campbell, J., (1989): Introduction to Remote Sensing, Guilford Press, New York.
2. Lillesand, T.M., and Kiefer, R.W., (1994): Remote Sensing and
Image Interpretation, John Wiley & Sons, New York.
3. Sarkar, A. (2015): Practical Geography: A Systematic Approach. Orient
Black Swan Private Ltd., New Delhi.
4. Anji Reddy, M. (2008): Textbook of Remote Sensing and Geographic
Information System, B.S. Publication, Hyderabad.
5. Campbell, J. B., (2007): Introduction to Remote Sensing, Guilford
Press.

6. Chauniyal, D.D., (2010): Sudur Samvedan evam Bhogolik Suchana
Pranali (Hindi), Sharda Pustak Bhawan, Allahabad.
7. Jensen, J.R. (2007): Remote Sensing of the Environment: An Earth
Resource Perspective, Prentice-Hall Inc., New Jersey.
8. Joseph, G. (2005): Fundamentals of Remote Sensing, Universities Press
(India), Hyderabad.

Unit 2
History, Organization and Development of
Space Programmes
Structure
Overview
Learning Objectives

2.1 Development of Remote Sensing


2.2 Milestones of Remote Sensing Development
2.3 History of India’s Space Programme

2.3.1 Communication Satellites


2.3.2 Earth Observation Satellites
2.3.3 Satellite Navigation
2.3.4 Standard Positioning Service (SPS) and Restricted Service (RS)
2.3.5 Mars Orbiter Mission
2.3.6 Satellite Launchers
Let Us Sum Up
Glossary
Check Your Progress
Suggested Readings
Web Sources

Overview
Remote sensing began in the 1840s as balloonists took pictures of the
ground using the newly invented photo camera. Perhaps the most novel
platform at the end of the nineteenth century was the famed pigeon fleet that
operated as a novelty in Europe. The principle behind photography, the
pinhole camera (camera obscura), was described by the Greek philosopher
Aristotle in the 4th century BCE. Photography itself was invented in the early
1800s; the world's first chemical photograph was taken in 1826 by Joseph
Niepce of France using a process known as heliography. In this unit, you will
learn about the historical development of remote sensing and the
development of Indian remote sensing.

Learning Objectives
After studying this unit, you will be able to understand the following.
• Development of Remote Sensing
• Milestones of Remote Sensing
• History of India’s Space Programme

2.1 Development of Remote Sensing


The first black and white aerial photograph was taken in the year 1860 by
James Wallace Black from a height of 2,000 feet over Boston using a hot air
balloon, and in 1861 colour photographs were taken by James Clerk
Maxwell. In 1897, Alfred Nobel became the first person in the world to
succeed in capturing an aerial photo with the help of a rocket-mounted
camera. In 1858, Gaspard Tournachon took an oblique photograph
of a small village near Paris from a balloon. With this picture, the era of earth
observation and remote sensing had started. His example was soon
followed by other people all over the world. During the period 1900-1914,
continuous development with respect to film, the mounting base of
cameras, height from the earth's surface, etc. took place in the field of aerial
photography, improving the quality of photographs and the ground
coverage.
In the early 20th century, remote sensing images were captured using
kites and even with cameras mounted on pigeons. In Europe, carrier
pigeons were already being used in military communication and aerial
reconnaissance was an appealing application. Small lightweight cameras
were attached to the birds and photos were automatically taken using a
timing mechanism. Pigeon photography was successful but didn't become
widely used due to the rapid development of aviation technology. In 1906
professional photographer George Lawrence used a string of kites to raise
a 49-pound camera 1,000 feet in the air to capture the devastation of the
earthquake in San Francisco. The steel kite line carried an electric current
to remotely trigger the shutter. The famous photograph "San Francisco in
Ruins" was taken 6 weeks after the earthquake and subsequent fires in San
Francisco.
The first aerial photographs taken from an aeroplane were captured in 1909
by Wilbur Wright. By the First World War, cameras mounted on aeroplanes
provided aerial views of large surface areas that proved invaluable in
military reconnaissance. By World War II aeroplanes were commonly
equipped with cameras; in fact, Allied forces recruited a team of experts to

review millions of stereoscopic aerial images to detect hidden Nazi rocket
bases. During the Cold War, the use of aerial reconnaissance increased
with U-2 aircraft flying at ultra-high altitudes (70,000 ft) to capture imagery.
Aerial photography grew quickly following the war and was soon employed
for a variety of purposes. These new photographs provided people with a
realistic vantage of the world few had seen before. Aerial photography was
a much faster and cheaper way to produce maps compared to traditional
ground surveys.
In the United States, aerial photography was used for farm programs
beginning in the Dust Bowl Era of the 1930s with the passing of the
Agricultural Adjustment Act. The agency, then known as the Agricultural
Adjustment Administration (AAA), began its aerial photography program in
1937, and by 1941 the AAA had flown and acquired aerial photographs of
more than 90% of the agricultural land in the US. The Agriculture
Department's aerial photography program became a tool for conservation
and land planning as well as an instrument of fair and accurate
measurement. The agricultural agencies have since been consolidated and
are now known as Farm Service Agency (FSA). The FSA is still responsible
for aerial imagery programs in the US. Aerial photography remained the
primary tool for depicting the Earth's surface until the early 1960s.

Landsat 1

The development of satellite-based remote sensing began with the "space
race" in the 1950s and 1960s. In 1957 the Soviet Union launched Sputnik
1, the world's first artificial satellite. The United States followed in 1958 with
the successful launch of Explorer 1. The next decades brought about rapid
developments in satellites and imaging technology. The first successful
meteorological satellite (TIROS-1) was launched in 1960. In 1972 Landsat
1, the first earth resource satellite, was launched by the US. The original
goal of the Landsat program was to collect data from the Earth through
remote sensing techniques. Landsat 1 was originally named Earth
Resources Technology Satellite 1 and was later renamed Landsat 1. The
Landsat program has continued for 45 years with Landsat 8 launched in
2013.
Since the launch of Sputnik in 1957, thousands of satellites have been
launched. There is a myriad of commercial and government satellites in
operation today, many of which are used for remote sensing applications.
There are currently over 3,600 satellites orbiting the Earth, but only
approximately 1400 are operational. Of these satellites, well over 100 are
Earth-observing satellites that carry a variety of different sensors to
measure and capture data about the Earth. These satellites are often
launched by governments to monitor Earth's resources, but private
commercial companies are becoming increasingly active in launching earth-
observing satellites as well.

2.2 Milestones of Remote Sensing Development

Sl. No.  Year  History of Development of Remote Sensing

1.  1830: Invention of stereoscopes.
2.  1840: Balloonists took pictures of the ground using the newly invented photo camera.
3.  1858: The first aerial photograph was claimed to have been taken by Felix Tournachon, known as Nadar, from a tethered balloon over the Bievre Valley in France.
4.  1903: The most novel platform at the end of the last century was the famed pigeon fleet that operated as a novelty in Europe.
5.  1919: Hoffman was the first to sense from an aircraft in the thermal IR.
6.  1920: First books on aerial photo interpretation.
7.  1931: Stevens's development of an IR-sensitive film (black & white).
8.  1940: Identification of V-1 rockets, radar, water depth for amphibious landings, etc.
9.  1942: Kodak patents the first false-colour IR-sensitive film.
10. 1950: Advances in sensor technology move into the multispectral range; colour-infrared photography recognized for non-military applications.
11. 1954: Westinghouse, under sponsorship from the USAF, develops the first side-looking airborne radar (SLAR) system.
12. 1957: With the advent of Sputnik (launched by Russia), the possibility of putting film cameras on orbiting spacecraft was realized. The first cosmonauts and astronauts carried cameras to document selected regions and targets of opportunity as they circumnavigated the globe.
13. 1958: Early prototypes for the TIROS (Television and Infrared Observation Satellite) and Vanguard were created.
14. 1960: TIROS-1 launched as the first meteorological satellite.
15. 1964: Nimbus Weather Satellite Program begins with the launch of Nimbus-1.
16. 1970: Remote sensing as an operational system for collecting information about the earth on a repetitive schedule matured in the 1970s, when instruments were flown on Skylab (and later, the Space Shuttle) and on Landsat, the first satellite dedicated specifically to monitoring land and ocean surfaces to map natural and cultural resources.
17. 1970: China launched its first communications satellite.
18. 1972: Launch of ERTS-1, the first Earth Resources Technology Satellite (later renamed Landsat-1); carried a return beam vidicon (RBV) and a multispectral scanner (MSS).
19. 1975: India built its first satellite, Aryabhata, which was launched by the USSR.
20. 1977: Launch of Meteosat-1, the first in a long series of European weather satellites.
21. 1978: Seasat, and Nimbus-7 with TOMS and CZCS; TOVS (TIROS Operational Vertical Sounder) became operational in 1978.
22. 1980: A radar imaging system was the main sensor on Seasat and, going into the 1980s, a variety of specialized sensors (CZCS, HCMM, and AVHRR among others) were placed in orbit, primarily as research or feasibility programs.
23. 1982: The first non-military radar system was JPL's Shuttle Imaging Radar (SIR-A) on the Space Shuttle in 1982; the Indian National Satellite (INSAT-1A) was also launched.
24. 1986: Launch of SPOT-1, which transmitted multispectral data at 20 m resolution and panchromatic data at 10 m resolution.
25. 1995: Launch of OrbView-1, the world's first commercial imaging satellite; launch of ERS-2, Radarsat-1 and IRS-1C.
26. 1999: Launch of Landsat 7, IKONOS (1 m resolution), QuikSCAT, CBERS-1, Terra, etc.
27. 2000: Shuttle SRTM mission; launch of Tsinghua-1 and EROS A1 (1 m resolution).
28. 2000: LiDAR, an active remote sensing technology, makes possible the characterization of forest vertical structure on scales ranging from an individual tree to the world's forests.
29. 2001: Launch of QuickBird (61 cm resolution); GSAT-1.
30. 2004: Republic of China Satellite (RocSat-2) launched, with high-resolution 2 m PAN and 8 m RGB imaging.
31. 2007: Launch of RapidEye, a constellation of five interlinked high-resolution satellites.
32. 2008: Cartosat-2A, carrying a PAN camera capable of capturing black and white pictures; IMS-1 and Chandrayaan-1.
33. 2009: RISAT-2 (Radar Imaging Satellite) to monitor India's borders.
34. 2010: StudSat (Student Satellite) and GSAT-5P.
35. 2011: ResourceSat-2, YouthSat, GSAT-8 and GSAT-12, Megha-Tropiques, SRMSat.
36. 2012: RISAT-1.
37. 2013: SARAL, IRNSS-1A, INSAT-3D, Mars Orbiter Mission (MOM).
38. 2015: IRNSS-1D, GSAT-6, Astrosat, GSAT-15.
39. 2016: IRNSS-1G, Cartosat-2C, Swayam-1, Pratham (to count electrons in the earth's atmosphere), ScatSat.
40. 2017: INS-1A (ISRO nano satellite) carrying SBR and SEUM payloads; IRNSS-1H.
41. 2018: Microsat-TD (microsatellite), HysIS, ExseedSat-1.
42. 2019: Microsat-R (military use), KalamSat-V, EMISAT, Chandrayaan-2, Cartosat-3, etc.

2.3 History of India’s Space Programme


IRS-1A, the first in a series of indigenous state-of-the-art operational remote
sensing satellites, was successfully launched into a polar sun-synchronous
orbit on March 17, 1988, from the Soviet Cosmodrome at Baikonur. The
successful launch of IRS-1A was one of the proudest moments for the entire
country, demonstrating the maturity of its satellite programme in addressing
the nation's varied natural resource management requirements. Its LISS-I
sensor had a spatial resolution of 72.5 meters with a swath of 148 km on the
ground. LISS-II had two separate imaging sensors, LISS-II A and LISS-II B,
each with a spatial resolution of 36.25 meters, mounted on the spacecraft in
such a way as to provide a composite swath of 146.98 km on the ground. The
IRS-1A satellite, with its LISS-I and LISS-II sensors, quickly enabled India to
map, monitor and manage its natural resources at coarse and medium spatial
resolutions. The operational availability of data products to user organisations
further strengthened the operationalisation of remote sensing applications
and management in the country.
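The swath and resolution figures quoted above fix the sensor geometry; for instance, the number of pixels recorded across one scan line follows by simple division. A minimal sketch (the function name is illustrative, not ISRO software):

```python
# Approximate pixels across one scan line from swath width and pixel size.
# Figures are those quoted above for IRS-1A's LISS-I and LISS-II sensors.

def pixels_per_line(swath_km: float, resolution_m: float) -> int:
    """Swath width divided by ground pixel size, rounded to whole pixels."""
    return round(swath_km * 1000.0 / resolution_m)

liss1 = pixels_per_line(148.0, 72.5)     # LISS-I: about 2041 pixels per line
liss2 = pixels_per_line(146.98, 36.25)   # LISS-II composite: about 4055 pixels
```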
IRS-1A was followed by the launch of IRS-1B, an identical satellite, in 1991.
These two satellites in the IRS series were the workhorses for generating
natural resources information in a variety of application areas, such as
agriculture, forestry, geology and hydrology. From then onwards, a series of
IRS spacecraft were launched with enhanced capabilities in payloads and
satellite platforms. The whole gamut of activities, from the evolution of IRS
missions by identifying user requirements to the utilisation of data from these
missions by user agencies, is monitored by the National Natural Resources
Management System (NNRMS), the nodal agency for natural resources
management and infrastructure development using remote sensing data in
the country. Apart from meeting general requirements, the definition of IRS
missions based on specific thematic applications, such as natural resources
monitoring, ocean and atmospheric studies and cartographic applications,
resulted in the realisation of theme-based satellite series, namely: (i)
land/water resources applications (RESOURCESAT and RISAT series); (ii)
ocean/atmospheric studies (OCEANSAT series, INSAT-VHRR, INSAT-3D,
Megha-Tropiques and SARAL); and (iii) large-scale mapping applications
(CARTOSAT series).
The development of IRS-1A was a major milestone in the IRS programme.
Thirty years after IRS-1A and a fruitful journey of the Indian remote sensing
programme, it is worth looking back at the achievements of the Indian Space
Programme, particularly in remote sensing applications, where India has
become a role model for others to follow. Significant progress has continued
in building and launching state-of-the-art Indian Remote Sensing satellites,
as well as in the operational utilisation of their data in various applications for
the nation.
Today, the array of Indian Earth Observation (EO) satellites, with imaging
capabilities in the visible, infrared, thermal and microwave regions of the
electromagnetic spectrum, including hyperspectral sensors, has helped the
country realise major operational applications. The imaging sensors have
been providing spatial resolutions ranging from 1 km to better than 1 m,
repeat observation (temporal imaging) from 22 days to every 15 minutes, and
radiometric resolution ranging from 7-bit to 12-bit, which has significantly
helped several applications at the national level. In the coming years, Indian
EO satellites are heading towards further strengthened and improved
technologies, taking cognizance of the lessons and achievements of past
years.
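Radiometric resolution quantifies how finely a sensor divides the measured signal: an n-bit sensor records 2^n grey levels. A quick illustration of the 7-bit to 12-bit range mentioned above:

```python
# Grey levels distinguishable by an n-bit sensor: 2 ** n.

def grey_levels(bits: int) -> int:
    """Number of discrete intensity levels an n-bit sensor can record."""
    return 2 ** bits

low = grey_levels(7)    # 7-bit sensor: 128 grey levels
high = grey_levels(12)  # 12-bit sensor: 4096 grey levels
```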
2.3.1 Communication Satellites
The Indian National Satellite (INSAT) system is one of the largest domestic
communication satellite systems in the Asia-Pacific region, with nine
operational communication satellites placed in geostationary orbit.
Established in 1983 with the commissioning of INSAT-1B, it initiated a major
revolution in India's communications sector and has sustained it ever since.
GSAT-17 joined the INSAT constellation of 15 operational satellites, namely
INSAT-3A, 3C, 4A, 4B, 4CR and GSAT-6, 7, 8, 9, 10, 12, 14, 15, 16 and 18.
The INSAT system, with more than 200 transponders in the C, Extended C
and Ku bands, provides services for telecommunications, television
broadcasting, satellite news gathering, societal applications, weather
forecasting, disaster warning, and search and rescue operations.

2.3.2 Earth Observation Satellites
Starting with IRS-1A in 1988, ISRO has launched many operational remote
sensing satellites. Today, India has one of the largest constellations of
remote sensing satellites in operation. Currently, thirteen operational
satellites are in Sun-synchronous orbit (RESOURCESAT-1, 2 and 2A;
CARTOSAT-1, 2, 2A and 2B; RISAT-1 and 2; OCEANSAT-2; Megha-
Tropiques; SARAL; and SCATSAT-1) and four are in geostationary orbit
(INSAT-3D, Kalpana-1, INSAT-3A and INSAT-3DR). A variety of instruments
have been flown on board these satellites to provide data at diversified
spatial, spectral and temporal resolutions, catering to different user
requirements in the country and for global usage. The data from these
satellites are used for several applications covering agriculture, water
resources, urban planning, rural development, mineral prospecting,
environment, forestry, ocean resources and disaster management.
2.3.3 Satellite Navigation
Satellite navigation is an emerging satellite-based service with commercial
and strategic applications. ISRO is committed to providing satellite-based
navigation services to meet the emerging demands of civil aviation and the
user requirements for positioning, navigation and timing based on an
independent satellite navigation system. To meet civil aviation requirements,
ISRO is working jointly with the Airports Authority of India (AAI) to establish
the GPS Aided GEO Augmented Navigation (GAGAN) system. To meet user
requirements for positioning, navigation and timing services based on an
indigenous system, ISRO is establishing a regional satellite navigation
system called the Indian Regional Navigation Satellite System (IRNSS).
GPS Aided GEO Augmented Navigation (GAGAN)

This is a Satellite-Based Augmentation System (SBAS) implemented jointly
with the Airports Authority of India (AAI). The main objectives of GAGAN are
to provide satellite-based navigation services with the accuracy and integrity
required for civil aviation applications and to support air traffic management
over Indian airspace.
Indian Regional Navigation Satellite System (IRNSS): NavIC

This is an independent Indian satellite-based positioning system for critical
national applications. Its main objective is to provide reliable position,
navigation and timing services over India and its neighbourhood, with good
accuracy for the user.

2.3.4 Standard Positioning Service (SPS) and Restricted Service (RS)
ISRO has built a total of nine satellites in the IRNSS series, of which eight
are currently in orbit. Three of these satellites are in geostationary orbit
(GEO), while the remaining are in geosynchronous orbits (GSO) inclined at
29° to the equatorial plane. The IRNSS constellation was named "NavIC"
(Navigation with Indian Constellation) by the Honourable Prime Minister,
Mr. Narendra Modi, and dedicated to the nation upon the successful launch
of the IRNSS-1G satellite. The eight operational satellites in the IRNSS
series, namely IRNSS-1A, 1B, 1C, 1D, 1E, 1F, 1G and 1I, were launched on
Jul 02, 2013; Apr 04, 2014; Oct 16, 2014; Mar 28, 2015; Jan 20, 2016;
Mar 10, 2016; Apr 28, 2016; and Apr 12, 2018, respectively. The
PSLV-C39/IRNSS-1H mission was unsuccessful; the satellite could not
reach orbit.
Space Science & Exploration
The Indian space programme encompasses research in areas such as
astronomy, astrophysics, planetary and earth sciences, atmospheric
sciences and theoretical physics. Balloons, sounding rockets, space
platforms and ground-based facilities support these research efforts. A series
of sounding rockets is available for atmospheric experiments. Several
scientific instruments have been flown on satellites, especially to detect
celestial X-ray and gamma-ray bursts.
AstroSat
AstroSat is the first dedicated Indian astronomy mission, aimed at studying
celestial sources in the X-ray, optical and UV spectral bands simultaneously.
The payloads cover the ultraviolet, limited optical and X-ray regimes
(0.3 keV to 100 keV). One of the unique features of the AstroSat mission is
that it enables simultaneous multi-wavelength observations of various
astronomical objects with a single satellite. AstroSat, with a lift-off mass of
1,515 kg, was launched on September 28, 2015, into a 650 km orbit inclined
at an angle of 6° to the equator by PSLV-C30 from Satish Dhawan Space
Centre, Sriharikota. The minimum useful life of the AstroSat mission is
expected to be five years.
2.3.5 Mars Orbiter Mission
The Mars Orbiter Mission is ISRO's first interplanetary mission, with an
orbiter craft designed to orbit Mars in an elliptical orbit of 372 km by
80,000 km. It can be termed both a challenging technological mission and a
science mission, considering the critical mission operations and stringent
requirements on propulsion, communications and other bus systems of the
spacecraft. The primary technological objective of the mission is to design
and realise a spacecraft capable of performing the Earth Bound Manoeuvre
(EBM), Martian Transfer Trajectory (MTT) and Mars Orbit Insertion (MOI)
phases, along with the related deep space mission planning and
communication management at a distance of nearly 400 million km.
Autonomous fault detection and recovery is also vital for the mission.
Chandrayaan-1
Chandrayaan-1, India's first mission to the Moon, was launched successfully
on October 22, 2008, from SDSC SHAR, Sriharikota. The spacecraft orbited
the Moon at a height of 100 km above the lunar surface for chemical,
mineralogical and photo-geologic mapping. It carried 11 scientific
instruments built in India, the USA, the UK, Germany, Sweden and Bulgaria.
Chandrayaan-2
Chandrayaan-2 is an advanced version of the previous Chandrayaan-1
mission to the Moon. It is configured as a two-module system comprising an
Orbiter Craft module (OC) and a Lander Craft module (LC) carrying the
Rover developed by ISRO.
Small Satellites
The small satellite project is envisaged to provide a platform for stand-alone
payloads for earth imaging and science missions within a quick turnaround
time. To make the platform versatile for different kinds of payloads, two kinds
of buses have been configured and developed.

Indian Mini Satellite-1 (IMS-1)

The IMS-1 bus has been developed as a versatile bus of the 100 kg class,
with a payload capability of around 30 kg. The bus has been developed using
various miniaturisation techniques. The first mission of the IMS-1 series was
launched successfully on April 28, 2008, as a co-passenger along with
Cartosat-2A. YouthSat, the second mission in this series, was launched
successfully along with Resourcesat-2 on April 20, 2011.
Indian Mini Satellite-2 (IMS-2) Bus

The IMS-2 bus has evolved as a standard bus of the 400 kg class, with a
payload capability of around 200 kg. IMS-2 development is an important
milestone, as it is envisaged to be a workhorse for different types of remote
sensing applications. The first mission of IMS-2 is SARAL, a cooperative
mission between ISRO and CNES with payloads from CNES and the
spacecraft bus from ISRO.
University / Academic Institute Satellites
ISRO's activities in building satellites for communication, remote sensing
and astronomy have inspired educational institutions. The launch of
Chandrayaan-1 increased the interest of universities and institutions in
building experimental student satellites. Capable universities and institutions
can venture into on-orbit space technology with guidance and support from
ISRO in the following ways.
Development of Payloads (by Universities/Institutions)
Every satellite carries a payload that performs the intended function to
achieve the mission goal, and a main bus that supports the payload function.
The payloads developed may comprise detectors, electronics and
associated algorithms, which can fly as experimental piggyback payloads on
ISRO's ongoing (small or operational) satellite projects. The design and
development of detectors, payload electronics and associated
algorithms/experiments that enhance the application of space services to
mankind is a continuing R&D activity in educational institutions all over the
world. Educational institutions can propose the payloads developed by them
to be flown on ISRO's small satellites.
Satellite Design & Fabrication by Universities/Institutions
Under this option, universities design, fabricate and test the satellite bus and
payload, and deliver the integrated spacecraft for launch. Technical guidance
in design, fabrication and testing will be provided by ISRO, which will also
provide some critical materials for the space mission. The designs and test
results will be reviewed by the ISRO team. More than one
university/institution may participate, with one among them acting as the
focal point for ISRO. After launch, the collected data will be archived and
disseminated by the university/institution(s).
2.3.6 Satellite Launchers
Launchers or Launch Vehicles are used to carry spacecraft to space. India
has two operational launchers: Polar Satellite Launch Vehicle (PSLV) and
Geosynchronous Satellite Launch Vehicle (GSLV). GSLV with indigenous
Cryogenic Upper Stage has enabled the launching of up to 2 tonne class of
communication satellites. The next variant of GSLV is GSLV Mk III, with an
indigenous high thrust cryogenic engine and stage, having the capability of
launching 4 tonne class of communication satellites.

Placing satellites accurately into their orbits requires a combination of
precision, efficiency, power and immaculate planning. ISRO's Launch
Vehicle Programme spans numerous centres and employs over 5,000
people. Vikram Sarabhai Space Centre, located in Thiruvananthapuram, is
responsible for the design and development of launch vehicles. Liquid
Propulsion Systems Centre and ISRO Propulsion Complex, located at
Valiamala and Mahendragiri respectively, develop the liquid and cryogenic
stages for these launch vehicles. Satish Dhawan Space Centre, SHAR, is
the spaceport of India and is responsible for the integration of launchers. It
houses two operational launch pads from which all GSLV and PSLV flights
take place. The Polar Satellite Launch Vehicle was developed to launch Low
Earth Orbit satellites into polar and Sun-synchronous orbits; it has since
proved its versatility by successfully launching geosynchronous, lunar and
interplanetary spacecraft. The Geosynchronous Satellite Launch Vehicle
was developed to launch the heavier INSAT class of geosynchronous
satellites into orbit; in its third and final stage, GSLV uses the indigenously
developed Cryogenic Upper Stage. ISRO also launches smaller rockets of
the Rohini series on suborbital and atmospheric flights for aeronomy and
meteorological studies. ATV, ISRO's heaviest sounding rocket, can be used
for microgravity experiments and for precursor experiments to characterise
new technologies.

Let Us Sum Up
Landsat: a series of unmanned NASA satellites that orbit the Earth and
collect multispectral imagery in various visible and infrared bands. Landsat
began as an open Earth resources programme and continues today through
more advanced Landsat satellites and other satellite resource monitoring
programmes.

Sentinel is a family of satellites developed by the European Space Agency
(ESA) under the Copernicus Programme, the Earth observation programme
managed by the ESA, launched in 1998. RESOURCESAT-2A was launched
on December 7, 2016.

Glossary
Geosynchronous orbit: an orbit in which the time it takes the satellite to
complete one revolution is equal to the time it takes the Earth to rotate once
around its polar axis.
Sun-synchronous orbit: an orbit in which the satellite travels from the north
pole to the south pole as the Earth rotates below it.
Heliocentric theory argues that the sun is the central body of the solar
system and perhaps of the universe; everything else (planets and their
satellites, asteroids, comets, etc.) revolves around it.
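The geosynchronous-orbit definition can be made quantitative with Kepler's third law: a satellite whose orbital period equals one sidereal day must orbit at a fixed altitude of roughly 35,786 km. A sketch using standard textbook constants:

```python
import math

# Kepler's third law for a circular orbit: r**3 = GM * T**2 / (4 * pi**2).
# A geosynchronous satellite's period T is one sidereal day.

GM = 3.986004418e14    # m^3/s^2, Earth's gravitational parameter
T = 86164.1            # s, one sidereal day
R_EARTH_KM = 6378.137  # km, Earth's equatorial radius

r_m = (GM * T**2 / (4 * math.pi**2)) ** (1 / 3)  # orbital radius in metres
altitude_km = r_m / 1000.0 - R_EARTH_KM          # about 35,786 km
```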

Check Your Progress


1. How many operational satellites are in sun-synchronous orbit for
India?
13 operational satellites
2. In which year was the term remote sensing coined?
1960
3. Name the sensors used in Landsat 8?
Operational Land Imager (OLI) and the Thermal Infrared Sensor
(TIRS).

Suggested Readings
1. Campbell J., 1989. Introduction to Remote Sensing, Guilford, New
York.
2. Lillesand, T.M. and Kiefer, R.W., 1994. Remote Sensing and
Image Interpretation, John Wiley & Sons, New York.
3. Sarkar, A. (2015): Practical Geography: A Systematic Approach.
Orient Black Swan Private Ltd., New Delhi.
4. Anji Reddy, M. (2008): Textbook of Remote Sensing and Geographic
Information System, B.S. Publication, Hyderabad.

5. Campbell, J. B., (2007): Introduction to Remote Sensing, Guilford
Press.
6. Chauniyal, D.D., (2010): Sudur Samvedan evam Bhogolik Suchana
Pranali (Hindi), Sharda Pustak Bhawan, Allahabad.
7. Jensen, J.R. (2007): Remote Sensing of the Environment: An Earth
Resource Perspective, Prentice-Hall Inc., New Jersey.
8. Joseph, G. (2005): Fundamentals of Remote Sensing, United Press
India.

Web Sources
1. https://gisgeography.com/landsat/
2. https://landsat.gsfc.nasa.gov/
3. https://sentinel.esa.int/web/sentinel/missions/sentinel-1

BLOCK 2

Remote Sensing Processes

Unit 3: Introduction to Remote Sensing

Unit 4: Sources of Energy and Electromagnetic


Radiations (EMR)

Unit 5: Electromagnetic Spectrum, Atmospheric


Windows

Unit 6: Energy Interaction with Atmosphere and Earth

Unit 3
Introduction to Remote Sensing
Structure
Overview
Learning Objectives
3.1 Introduction to Remote Sensing

3.1.1 Definitions of Remote Sensing


3.1.2 Scope of Remote Sensing
3.1.3 Overview of the Remote Sensing Process

Let us Sum Up
Glossary
Check Your Progress
Suggested Readings
Web Sources

Overview
The term "remote sensing" refers to gathering data from a distance, and it is
an exciting field to work in. Remote sensing makes use of any or all of the
electromagnetic spectrum, recording the electromagnetic energy reflected or
emitted by the earth's surface. Understanding the interactions of energy with
various aspects of the earth's surface helps us analyse remotely sensed
images.
Learning Objectives
After studying this unit, you will be able to understand the following:

• Definition and scope of remote sensing


• Remote sensing process
• Gain knowledge, concepts, and applications

3.1 Introduction to Remote Sensing


Remote sensing techniques allow taking images of the earth's surface in
various wavelength regions of the electromagnetic spectrum (EMS). One of
the major characteristics of a remotely sensed image is the wavelength
region it represents in the EMS. Some images represent reflected solar
radiation in the visible and near-infrared regions of the electromagnetic
spectrum; others are measurements of the energy emitted by the earth's
surface itself. The energy measured in the microwave region is a measure
of the relative return from the earth's surface, where the energy is transmitted
from the vehicle itself. This is known as active remote sensing, since the
energy source is provided by the remote sensing platform. Systems whose
measurements depend upon an external energy source, such as the sun,
are referred to as passive remote sensing systems.
Remote sensing gives information about the earth's objects from a distance,
without any physical contact with the earth's surface. Simply, we can define
remote sensing as a technique to collect and interpret information about an
object, area, incident (disaster) or change (land use) without being in
physical contact with the earth's surface. Aircraft, satellites and drones are
the major platforms for remote sensing of the earth and its natural resources.
3.1.1 Definitions of Remote Sensing
• Remote sensing is the science of data collection regarding an object or
phenomena without physical contact with the object.
• Remote sensing is the science and art of obtaining information about an
object, area, or phenomenon through the analysis of data acquired by a
device that is not in contact with the object, area, or phenomenon under
investigation.

• Remote sensing is the art or science of telling something about an object


without touching it.
• Remote sensing is the acquisition of physical data of an object without
touch or contact.
• Remote sensing is the science of deriving information about an object
from measurements made at a distance from the object, i.e., without
meeting it (fig.3.1).
3.1.2 Scope of Remote Sensing
• Remote Sensing is used as a tool in many disciplines.

• The development of remote sensing and its wider application in


various disciplines have created a wealth of information.
• Satellite images are permanent records, providing useful information
in various wavebands.
• Large area coverage enables regional surveys on a variety of
themes in various wave bands.
• Repetitive coverage allows monitoring of dynamic themes like water,
agriculture etc.
• Easy data acquisition at different scales and resolutions.

Fig.3.1 Satellite remote sensing system with five components


• A single remotely sensed image can be analyzed and interpreted for
different purposes and applications.
• The images are analyzed in the laboratory, reducing the amount of
field work; analysis of remote sensing data is therefore cost-effective.
• The technology of modern remote sensing began with the invention
of the camera more than 150 years ago.
• The first, rather primitive, photographs were taken as "stills" on the
ground.
• In the 1840s, pictures were taken from cameras mounted on balloons
for the purpose of topographic mapping.
• In the First World War, cameras mounted on airplanes provided aerial
views of large surface areas.
• In 1946, V-2 rockets acquired from Germany after World War II were
launched to high altitudes.
3.1.3 Overview of the Remote Sensing Process
The science of collecting information about an object or phenomenon
without coming into close contact with it is known as remote sensing. It helps
in surveying, gathering and retrieving data in difficult-to-access locations.
Remote sensing is the process of extracting information about the Earth's
land and ocean surfaces from images taken from above, using
electromagnetic radiation reflected or emitted from the Earth's surface in one
or more portions of the electromagnetic spectrum.
Geology, forestry, soil science, geography and urban planning are just a few
of the disciplines that study physical objects in this way. Sensor data are
created when an instrument (such as a camera or radar) records
electromagnetic radiation emitted or reflected from the ground. Because of
their unfamiliar overhead perspective, unique resolutions and use of spectral
regions outside the visible spectrum, sensor data might appear abstract and
strange to many people. As a result, successful use of sensor data relies on
analysis and interpretation to convert the data into information that can be
used to solve real problems, such as landfill siting or mineral deposit
searching. These interpretations result in extracted information, made up of
transformations of the sensor data that reveal particular kinds of information.
A more realistic perspective shows how the same sensor data may be
analysed from several perspectives to provide different interpretations.

Fig.3.2 Remote Sensing Processes

Finally, there are the applications, in which the analysed remote sensing
data can be integrated with other data to solve a specific practical problem,
such as land use planning, mineral prospecting or water quality mapping.
Applications are implemented in the field of GIS when digital remote sensing
data are merged with other geospatial data. For example, remote sensing
data may provide accurate land-use information that can be combined with
soil, geologic, transportation and other information to guide the siting of a
new landfill.
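The siting example above amounts to overlaying co-registered grids and testing each cell against a rule. A toy sketch in pure Python (the class codes and the suitability rule are invented for illustration, not a real GIS workflow):

```python
# Two co-registered 2x2 grids: a land-use map derived from remote sensing
# and a soil map from another geospatial source.
land_use = [["forest", "barren"],
            ["barren", "urban"]]
soil     = [["clay",   "clay"],
            ["sand",   "clay"]]

def suitable(lu: str, s: str) -> bool:
    # Hypothetical rule: barren land on clay soil is a candidate landfill site.
    return lu == "barren" and s == "clay"

# Overlay: keep (row, col) indices of cells that pass the rule.
candidates = [(r, c)
              for r, row in enumerate(land_use)
              for c, lu in enumerate(row)
              if suitable(lu, soil[r][c])]
# candidates -> [(0, 1)]
```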

Let us Sum Up
The science of collecting information about an object or phenomenon
without coming into close contact with it is known as remote sensing. It helps
in surveying, gathering and retrieving data in difficult-to-access locations.
Remote sensing is the process of extracting information about the Earth's
land and ocean surfaces from images taken from above, using
electromagnetic radiation reflected or emitted from the Earth's surface in one
or more portions of the electromagnetic spectrum.

Glossary
Remote Sensing: the science of data collection regarding an object or
phenomenon without physical contact with the object.

Check Your Progress


1. Write a note on Basics of Remote Sensing
Detection and discrimination of objects or surface features means detecting
and recording the radiant energy reflected or emitted by objects or surface
material. Different objects return different amounts of the energy incident
upon them in different bands of the electromagnetic spectrum. This depends
on the properties of the material (structural, chemical and physical), surface
roughness, angle of incidence, and the intensity and wavelength of the
radiant energy. Remote sensing is basically a multi-disciplinary science
which combines various disciplines such as optics, spectroscopy,
photography, computing, electronics and telecommunication, satellite
launching, etc.

Suggested Readings
1. Campbell J., 1989. Introduction to Remote Sensing, Guilford, New
York.
2. Lillesand, T.M. and Kiefer, R.W., 1994. Remote Sensing and
Image Interpretation, John Wiley & Sons, New York.

3. Sarkar, A. (2015): Practical Geography: A Systematic Approach.
Orient Black Swan Private Ltd., New Delhi.
4. Anji Reddy, M. (2008): Textbook of Remote Sensing and Geographic
Information System, B.S. Publication, Hyderabad.
5. Campbell, J. B., (2007): Introduction to Remote Sensing, Guilford
Press.

6. Chauniyal, D.D., (2010): Sudur Samvedan evam Bhogolik Suchana
Pranali (Hindi), Sharda Pustak Bhawan, Allahabad.

Web Sources
1. https://gisgeography.com/landsat/

2. https://landsat.gsfc.nasa.gov/
3. https://sentinel.esa.int/web/sentinel/missions/sentinel-1

Unit 4
Sources of Energy and Electromagnetic
Radiations (EMR)
Structure
Overview
Learning Objectives
4.1 Sources of Energy

4.2 Electromagnetic Radiations


4.2.1 Visible Range
4.2.2 Non-visible Range
Let us Sum Up
Glossary
Check Your Progress
Suggested Readings
Web Sources

Overview

Electromagnetic radiation consists of electromagnetic waves, which are
synchronized oscillations of electric and magnetic fields. Electromagnetic
radiation, or electromagnetic waves, is created by a periodic change of an
electric or magnetic field. Depending on how this periodic change occurs
and the power generated, different wavelengths of the electromagnetic
spectrum are produced. In a vacuum, electromagnetic waves travel at
the speed of light, commonly denoted c. In homogeneous, isotropic media,
the oscillations of the two fields are perpendicular to each other and
perpendicular to the direction of energy and wave propagation, forming
a transverse wave. The position of an electromagnetic wave within
the electromagnetic spectrum can be characterized by either
its frequency of oscillation or its wavelength. Electromagnetic waves of
different frequencies are called by different names, since they have different
sources and effects on matter.

Learning Objectives
After Learning this lesson, you will be able to:
• Know the Concepts of Electro-Magnetic Radiation

• Calculate Frequency or Wavelength for Electromagnetic Radiation

4.1 Sources of Energy

Electromagnetic energy is used to power the modern world. Without
advanced electromagnetic technology, cell phones and computers,
Bluetooth, GPS systems, satellite imagery, and scientific understanding of
our planet and space as we know it would not be possible. As technological
applications and appliances continue to advance, reliance on and
understanding of electromagnetic technology are more critical than ever.
Electromagnetic energy is radiant energy that travels in waves at the speed
of light. It can also be described as radiant energy, electromagnetic
radiation, electromagnetic waves, light, or the movement of radiation.
Electromagnetic radiation can transfer heat: electromagnetic waves carry
heat, energy, or light waves through a vacuum or a medium from one point
to another. Electromagnetic radiation was first predicted theoretically by
James Clerk Maxwell, a 19th-century physicist whose findings greatly
influenced what would become known as quantum mechanics. When it
comes to how it works, we can think of electromagnetic energy or radiation
as working similarly to a regular ocean wave. In this metaphor, the radiation
is the water: electromagnetic waves are the ocean waves, and the
electromagnetic energy is the work done as the waves carry water from the
middle of the ocean to the shore.

4.2 Electromagnetic Radiation

Electromagnetic radiation carries electromagnetic energy through space or
matter via a transmitting field. Electromagnetic radiation can be
described in terms of a stream of photons, each traveling in a wave-like
pattern, moving at the speed of light and carrying some amount of energy.
The only difference between radio waves, visible light, and gamma-rays is
the energy of the photons. Radio waves have photons with the lowest
energies; microwaves have a little more energy than radio waves; infrared
has still more; and visible light, ultraviolet, X-rays, and gamma-rays
follow with progressively higher photon energies. The amount of energy a
photon carries makes it sometimes behave more like a wave and sometimes
more like a particle. This is called the "wave-particle duality"
of light.
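The photon-energy ordering described above follows Planck's relation E = hc/λ, a standard physical result not written out in this unit. A short sketch comparing photon energies at representative wavelengths (the wavelengths and function name are illustrative, not from the text):

```python
# Planck's relation: E = h * c / wavelength.
# The wavelengths chosen below are illustrative values, not from this text.
h = 6.626e-34  # Planck's constant, J*s
c = 2.998e8    # speed of light, m/s

def photon_energy_joules(wavelength_m):
    """Energy carried by a single photon of the given wavelength."""
    return h * c / wavelength_m

examples = {
    "radio (1 m)": 1.0,
    "visible (0.5 um)": 0.5e-6,
    "gamma (0.01 nm)": 0.01e-9,
}
for name, wl in examples.items():
    print(f"{name}: {photon_energy_joules(wl):.3e} J")
# Shorter wavelength -> higher photon energy, matching the ordering above.
```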

Fig. 4.1 Electro Magnetic Radiation

Fig. 4.2 Speed of Light

The speed of light is equal to 299,792,458 m/s (about 186,282
miles/second). EMR is a dynamic form of energy that propagates as wave
motion at a velocity of c = 3 × 10¹⁰ cm/sec. The parameters that
characterize a wave motion are wavelength, frequency and velocity, which
are related by c = λν. An electromagnetic wave has two components, an
electric field E and a magnetic field M, both perpendicular to the
direction of propagation. Electromagnetic energy radiates in accordance
with basic wave theory, which describes EM energy as travelling in a
harmonic, sinusoidal fashion at the velocity of light. Although many
characteristics of EM energy are easily described by wave theory, another
theory, known as particle theory, offers insight into how electromagnetic
energy interacts with matter. An electric field accelerates an atomic
particle, such as an electron, forcing it to travel, resulting in EM
radiation. The motion causes oscillating electric and magnetic fields, at
right angles to each other, which propagate in bundles of light energy
called photons. Photons move at the fastest conceivable speed in the
universe, which is 186,282 miles per second (299,792,458 meters per
second) in a vacuum, commonly known as the speed of light. Frequency,
wavelength, and energy are all properties of waves.
The application of electromagnetic radiation in remote sensing allows for
data transmission. EMR is a type of energy that shows itself in the form of
visible effects when it interacts with matter. All signals collected by the
majority of remote sensing sensors originate from electromagnetic
radiation. Depending on the sensor's features, the source of this energy
changes. The remote sensing system's components are linked by this
energy.
For remote sensing, two features of electromagnetic radiation are very
significant. These are the wavelength and frequency. The wavelength
(lambda λ) is the distance between successive wave crests and measures
the length of one wave cycle. The number of cycles of a wave passing a
fixed point per unit of time is referred to as frequency (v).
c = λν
Where,
c is the speed of light
λ is the wavelength
ν is the frequency
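The relation c = λν can be rearranged to compute either quantity from the other, as this unit's learning objective suggests. A minimal sketch (the function names are illustrative, not from the text):

```python
C = 299_792_458.0  # speed of light in m/s, as given above

def frequency_from_wavelength(wavelength_m):
    """nu = c / lambda."""
    return C / wavelength_m

def wavelength_from_frequency(frequency_hz):
    """lambda = c / nu."""
    return C / frequency_hz

# Example: red light at 0.7 um has a frequency of about 4.3e14 Hz.
red = 0.7e-6
print(f"{frequency_from_wavelength(red):.3e} Hz")
```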

Fig. 4.3 Electromagnetic Spectrum

4.2.1 Visible Range
The visible portion of the electromagnetic spectrum is small since the
spectral sensitivity of the human eye is only about 0.4 to 0.7 µm. Film and
photo detectors are used to capture these images. Blue (0.4-0.5µm), green
(0.5-0.6µm) and red (0.6-0.7µm) are the wavelengths that fall within this
visual range.
4.2.2 Non-visible Range
• Gamma-ray: The wavelength in this region is less than 0.03 nm.
The upper atmosphere absorbs all the incoming radiation from this
region.
• X-ray: This region has a wavelength range of 0.03 to 3.0 nm. The
atmosphere absorbs these rays as well.

• Ultraviolet: The wavelength extends from 0.03 to 0.4 µm in the


ultraviolet region. Ozone in the upper atmosphere entirely absorbs
incoming wavelengths less than 0.3µm. It causes fluorescence and it
has applications in geology and vegetation.
• Infra-Red: There are three logical zones in this spectrum.
• Near Infra-Red (NIR)

• Reflected Infra-Red / Mid Infra-Red (MIR)


• Thermal Infra-Red (TIR)
• Near Infra-Red: The interaction of this region with matter varies with
wavelength. This wavelength is categorized between 0.7 to 1.3µm.
• Reflected Infra-Red: This region falls in the wavelength range of 1.3 to
3.0µm. This reflected solar radiation contains no information about the
thermal properties of materials.
• Thermal Infra-Red: The TIR region is grouped in the wavelength regions 3
to 5µm and 8 to 14µm. Optical mechanical scanners and specific vidicon
systems can capture images at these wavelengths; films cannot. Within
the infrared portion of the spectrum, only thermal infrared radiation is
directly related to the sensation of heat, whereas NIR and MIR are not.
• Microwave: This region has wavelengths ranging from 1mm to 1m.
These are the regions with longer wavelengths that can penetrate
clouds, fog, and rain. One of the active forms of microwave remote
sensing is radar.

• Radio wave: This region's wavelength ranges from 10 cm to 100 km.
This is the portion of the electromagnetic spectrum with the longest
wavelengths. Some classified radars with very long wavelengths
operate in this region.

Let us Sum Up
Electromagnetic radiation is a wave of electric and magnetic fields
propagating at the speed of light c through empty space. In this wave the
electric and magnetic fields continually change their magnitude and direction.
second. An electric field accelerates an atomic particle, such as an electron,
forcing it to travel, resulting in EM radiation. The motion causes oscillating
electric and magnetic fields, which propagate in bundles of light energy
called a photon at right angles to each other. Frequency, wavelength, and
energy are all properties of waves. The application of electromagnetic
radiation in remote sensing allows for data transmission. EMR is a type of
energy that shows itself in the form of visible effects when it interacts with
matter. All signals collected by most remote sensing sensors originate from
electromagnetic radiation. Depending on the sensor's features, the source
of this energy changes. The remote sensing system's components are
linked by this energy.

Glossary
Gamma-ray: The wavelength in this region is less than 0.03 nm.
X-ray: This region has a wavelength range of 0.03 to 3.0 nm. The
atmosphere absorbs these rays as well.
Ultraviolet: The wavelength extends from 0.03 to 0.4 µm in the ultraviolet
region. Ozone in the upper atmosphere entirely absorbs incoming
wavelengths less than 0.3µm. It causes fluorescence and it has applications
in geology and vegetation.
Infra-Red: There are three logical zones in this spectrum.
• Near Infra-Red (NIR)
• Reflected Infra-Red / Mid Infra-Red (MIR)
• Thermal Infra-Red (TIR)
Microwave: This region has wavelengths ranging from 1mm to 1m. These
are the regions with longer wavelengths that can penetrate clouds, fog, and
rain. One of the active forms of microwave remote sensing is radar.

Radio wave: This region's wavelength ranges from 10 cm to 100 km. This
is the portion of the electromagnetic spectrum with the longest wavelengths.
Some classified radars with very long wavelengths operate in this region.

Check Your Progress


Describe Electromagnetic Radiation with Diagram
Electromagnetic radiation is a carrier of electromagnetic energy by
transmitting field through space or matter. Electromagnetic radiation can be
described in terms of a stream of photons, each traveling in a wave-like
pattern, moving at the speed of light and carrying some amount of energy.
The only difference between radio waves, visible light,
and gamma-rays is the energy of the photons. Radio waves have photons
with the lowest energies; microwaves have a little more energy than radio
waves; infrared has still more; and visible light, ultraviolet, X-rays,
and gamma-rays follow with progressively higher photon energies.

Suggested Readings

1. Campbell, J., 1989. Introduction to Remote Sensing, Guilford, New
York.
2. Lillesand, T. M. and Kiefer, R. W., 1994. Remote Sensing and Image
Interpretation, John Wiley & Sons, New York.
3. Sarkar, A. (2015): Practical Geography: A Systematic Approach.
Orient Black Swan Private Ltd., New Delhi.
4. Anji Reddy, M. (2008): Textbook of Remote Sensing and Geographic
Information System, B.S. Publication, Hyderabad.
5. Campbell, J. B., (2007): Introduction to Remote Sensing, Guildford
Press.
6. Chauniyal, D.D., (2010): Sudur Samvedanevam Bhogolik Suchana
Pranali (Hindi), Sharda Pustak Bhawan, Allahabad.

Web Sources

1. https://gisgeography.com/landsat/

2. https://landsat.gsfc.nasa.gov/
3. https://sentinel.esa.int/web/sentinel/missions/sentinel-1

Unit 5
Electromagnetic Spectrum and
Atmospheric Windows
Structure
Overview
Learning Objectives
5.1 Electromagnetic Spectrum
5.1.1 Spectral Reflectance Patterns Visible Region

5.1.2 Spectral Reflectance Patterns Non-Visible Region


5.1.3 Spectral Reflectance Patterns of Visible and Non-Visible Regions
5.2 Atmospheric Windows
Let us Sum Up
Glossary
Check Your Progress
Suggested Readings

Overview
Electromagnetic waves are categorized based on their wavelengths in the
electromagnetic spectrum. The most prevalent unit used to measure
wavelength is the micrometre (1 µm = 1 × 10⁻⁶ m). Every natural and
synthetic object on the earth's surface and near surface reflects and
emits EMR over a range of wavelengths in its own characteristic way,
according to its chemical composition and physical state.

Learning Objectives
After Learning this lesson, you will be able to:
• Know the Concepts of Electro-Magnetic Spectrum

• Identify and Compare the Characteristics of Electromagnetic


Spectrum including Speed, Wavelength and Frequency

5.1 Electromagnetic Spectrum
The electromagnetic spectrum covers electromagnetic waves with
frequencies ranging from below one hertz to above 10²⁵ hertz,
corresponding to wavelengths from thousands of kilometers down to a
fraction of the size of an atomic nucleus. This frequency range is divided
into separate bands, and the electromagnetic waves within each frequency
band are called by different names; beginning at the low frequency (long
wavelength) end of the spectrum these are: radio waves, microwaves,
infrared, visible light, ultraviolet, X-rays, and gamma rays at the high-
frequency (short wavelength) end. The electromagnetic waves in each of
these bands have different characteristics, such as how they are produced,
how they interact with matter, and their practical applications. There is no
known limit for long wavelengths, while it is thought that the short
wavelength limit is in the vicinity of the Planck length. Extreme ultraviolet,
soft X-rays, hard X-rays and gamma rays are classified as ionizing radiation
because their photons have enough energy to ionize atoms, causing
chemical reactions. Exposure to ionizing radiation can be a health hazard,
causing radiation sickness, DNA damage and cancer. Radiation of visible
light and longer wavelengths are classified as nonionizing radiation because
they have insufficient energy to cause these effects.
The electromagnetic spectrum spans wavelengths from very short
gamma rays (10⁻¹⁰ m) to long radio waves (10⁶ m). However, in remote
sensing activities, the most useful regions of the EMR are the visible (0.4 to 0.7
μm), reflected infrared (0.7 to 3 μm), thermal infrared (3 to 5 and 8 to 14 μm)
and microwave (0.3 to 300 cm) regions. Broad divisions of the
electromagnetic spectrum are summarized in the following table.
Table 5.1 Spectral Band with Wavelength

Spectral Band   Wavelength
Gamma Rays      < 0.03 nm
X Rays          0.03 - 3 nm
Ultraviolet     3 nm - 0.4 μm
Visible         0.4 μm - 0.7 μm
Infrared        0.7 μm - 1000 μm
Microwave       1 mm - 1 m
Radio           > 1 m
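The broad divisions in Table 5.1 can be expressed as a simple lookup. The sketch below classifies a wavelength (in metres) into those divisions; the function name and the handling of exact boundaries are assumptions, not part of the table:

```python
# Broad divisions from Table 5.1, with upper limits expressed in metres.
# Which band owns an exact boundary value is an assumption made here.
BANDS = [
    ("Gamma Rays", 0.03e-9),   # < 0.03 nm
    ("X Rays", 3e-9),          # 0.03 - 3 nm
    ("Ultraviolet", 0.4e-6),   # 3 nm - 0.4 um
    ("Visible", 0.7e-6),       # 0.4 - 0.7 um
    ("Infrared", 1e-3),        # 0.7 um - 1000 um (1 mm)
    ("Microwave", 1.0),        # 1 mm - 1 m
]

def spectral_band(wavelength_m):
    """Return the Table 5.1 division containing the given wavelength."""
    for name, upper in BANDS:
        if wavelength_m < upper:
            return name
    return "Radio"  # wavelengths longer than 1 m

print(spectral_band(0.55e-6))  # green light -> Visible
print(spectral_band(0.05))     # 5 cm -> Microwave
```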

Fig. 5.1 Electro – Magnetic Spectrum
5.1.1 Spectral Reflectance Patterns Visible Region
The visible portion of the electromagnetic spectrum is small; the spectral
sensitivity of the human eye is only about 0.4 to 0.7 μm. The visible
range comprises blue (0.4 to 0.5 μm), green (0.5 to 0.6 μm) and red
(0.6 to 0.7 μm).
5.1.2 Spectral Reflectance Patterns Non-Visible Region
Gamma ray region: In this region the wavelength is less than 0.03 nm. The
incoming radiation is completely absorbed by the upper atmosphere and is not
available for remote sensing.

X-Ray region: The wavelength range is 0.03 to 3.0 nm. This radiation is
likewise completely absorbed by the atmosphere.
Ultraviolet region: The ultraviolet region falls in the wavelength range
from 0.03 to 0.4 μm. Incoming wavelengths less than 0.3 μm are
completely absorbed by ozone in the upper atmosphere. Ultraviolet
radiation causes fluorescence and is useful in some geological and
vegetation applications. The photographic ultraviolet region lies here and
is detectable with film and photodetectors.
Infra-Red region: There are three logical zones in this spectrum.

• Near Infra-Red (NIR)


• Mid Infra-Red (MIR)

• Thermal Infra-Red (TIR)
Near Infra-Red (NIR)
This wavelength region is 0.7 to 1.3 μm and interactions of this region with
matter vary with wavelength.
Mid Infra-Red (MIR)
This region’s wavelength range is 1.3 to 3.0 μm. This reflected solar
radiation contains no information about the thermal properties of material.
The band from 0.7 to 0.9 μm is detectable with film and is called the
photographic IR band; it can be detected using electro-optical sensors.
Thermal Infra-Red (TIR)
The TIR region is grouped in the wavelength regions 3 to 5 μm and 8 to 14 μm.
These are the principal atmospheric windows in the thermal region. Images
at these wavelengths can be acquired by optical mechanical scanners and
special vidicon systems, but cannot be detected using films.
Microwave region:
The wavelength range of this region falls from 1 mm to 1 m. The microwaves
are further divided into different wavelength bands. These are the longer
wavelength regions and can penetrate clouds, fog, and rain. Images can
be acquired in either passive or active mode. Radar is one of the active
forms of microwave remote sensing.

Table 5.2 Band and Wavelength

BAND   WAVELENGTH
P      30 - 100 cm
L      15 - 30 cm
S      7.5 - 15 cm
C      3.8 - 7.5 cm
X      2.4 - 3.8 cm
Ku     1.7 - 2.4 cm
K      1.1 - 1.7 cm
Ka     0.75 - 1.1 cm

Radio Waves region:
The wavelength of this region extends from 10 cm to 100 km. This region is
the longest wavelength portion of the electromagnetic spectrum.
5.1.3 Spectral Reflectance Patterns of Visible and Non-Visible
Regions

Radio: This is the same kind of energy that radio stations emit into the
air for your boom box to capture and turn into your favorite tunes. But
radio waves are also emitted by other things such as stars and gases in
space.

Microwaves: Microwaves are used by astronomers to learn about the
structure of nearby galaxies, including our own Milky Way.

Infrared: It is the same thing as 'heat' because it makes our skin feel
warm. In space, IR light maps the dust between stars.

Visible: This is the part that our eyes see. Visible radiation is emitted
by everything from fireflies to light bulbs to stars, and by fast-moving
particles hitting other particles.

Ultraviolet: The Sun is a source of ultraviolet (or UV) radiation; it is
the UV rays that cause our skin to burn. Stars and other "hot" objects in
space emit UV radiation.

X-rays: Hot gases in the Universe also emit X-rays.

Gamma-rays: Radioactive materials (some natural and others made by man in
things like nuclear power plants) can emit gamma-rays. Big particle
accelerators that scientists use to help them understand what matter is
made of can sometimes generate gamma-rays. But the biggest gamma-ray
generator of all is the Universe: it makes gamma radiation in all kinds of
ways.

5.2 Atmospheric Windows


An atmospheric window is a range of wavelengths of the electromagnetic
spectrum that can pass through the earth's atmosphere with relatively
little absorption by atmospheric gases. The major windows are the visible
window, from 0.3 to 0.9 μm; the infrared window, from 8 to 13 μm; and the
microwave window, at wavelengths longer than 1 mm. In the Earth's
atmosphere the infrared window is roughly the region between 8 and 14 μm,
although it can be narrowed or closed at times and places of high humidity
because of the strong absorption in the water vapour continuum or because
of blocking by clouds.

The first section in JetStream, The Atmosphere, provided information about
the Earth-Atmosphere energy balance. That section refers to the total
combined energy received from the sun and emitted by the earth and
atmosphere. However, not all wavelengths of electromagnetic radiation
from the sun reach the earth, and not all wavelengths emitted by the earth
reach into space. The atmosphere absorbs some of this energy while
allowing other wavelengths to pass through. The places where energy
passes through are called "atmospheric windows". We use these "windows"
in remote sensing to peer into the atmosphere, from which we can obtain
much information concerning the weather. Most of the sun's energy arrives
as visible light and near-infrared radiation, while all the outgoing
energy emitted by the earth is infrared.

Fig. 5.3 Atmospheric Windows

Let us Sum Up
The electromagnetic spectrum is the range of frequencies of
electromagnetic radiation and their respective wavelengths and photon
energies. The electromagnetic spectrum covers electromagnetic waves
with frequencies ranging from below one hertz to above 1025 hertz,
corresponding to wavelengths from thousands of kilometers down to a
fraction of the size of an atomic nucleus. This frequency range is divided
into separate bands, and the electromagnetic waves within each frequency
band are called by different names; beginning at the low frequency end of
the spectrum these are: radio waves, microwaves, infrared, visible light,
ultraviolet, X-rays, and gamma rays at the high-frequency end.

Glossary
Gamma-ray: The wavelength in this region is less than 0.03 nm.
X-ray: This region has a wavelength range of 0.03 to 3.0 nm. The
atmosphere absorbs these rays as well.
Ultraviolet: The wavelength extends from 0.03 to 0.4 µm in the ultraviolet
region. Ozone in the upper atmosphere entirely absorbs incoming
wavelengths less than 0.3µm. It causes fluorescence and it has applications
in geology and vegetation.
Microwave: This region has wavelengths ranging from 1mm to 1m. These
are the regions with longer wavelengths that can penetrate clouds, fog, and
rain. One of the active forms of microwave remote sensing is radar.
Radio wave: This region's wavelength ranges from 10 cm to 100 km. This
is the portion of the electromagnetic spectrum with the longest wavelengths.
Some classified radars with very long wavelengths operate in this region.

Check Your Progress


1. Define: Photon Energies
The electromagnetic spectrum is the range of frequencies of
electromagnetic radiation and their respective wavelengths and photon
energies.
2. Define: Spectroscopy

Spectroscopy can be used to separate waves of different frequencies,


producing a spectrum of the constituent frequencies.
3. Define: Electromagnetic waves
This frequency range is divided into separate bands, and the
electromagnetic waves within each frequency band are called by different
names; beginning at the low frequency end of the spectrum these are: radio
waves, microwaves, infrared, visible light, ultraviolet, X-rays, and gamma
rays at the high-frequency end.

Suggested Readings
1. Campbell, J., 1989. Introduction to Remote Sensing, Guilford, New
York.
2. Lillesand, T. M. and Kiefer, R. W., 1994. Remote Sensing and Image
Interpretation, John Wiley & Sons, New York.
3. Sarkar, A. (2015): Practical Geography: A Systematic Approach.
Orient Black Swan Private Ltd., New Delhi.
4. Anji Reddy, M. (2008): Textbook of Remote Sensing and Geographic
Information System, B.S. Publication, Hyderabad.
5. Campbell, J. B., (2007): Introduction to Remote Sensing, Guildford
Press.
6. Chauniyal, D.D., (2010): Sudur Samvedanevam Bhogolik Suchana
Pranali (Hindi), Sharda Pustak Bhawan, Allahabad.

Unit 6
Energy Interaction with Atmosphere and
The Earth
Structure
Overview
Learning Objectives

6.1 Interaction of EMR with Atmosphere


6.1.1 Rayleigh Scattering
6.1.2 Mie Scattering

6.1.3 Nonselective Scattering


6.2 Interaction of EMR with Earth’s Surface Features
6.2.1 Reflection
6.2.2 Transmission
6.2.3 Spectral Signature
6.3 Reflectance Characteristics of Earth’s Cover Types
Let us Sum Up
Glossary
Check Your Progress
Suggested Readings

Overview
Before radiation used for remote sensing reaches the Earth's surface it has
to travel through some distance of the Earth's atmosphere. Particles and
gases in the atmosphere can affect the incoming light and radiation. These
effects are caused by the mechanisms of scattering and absorption.
Scattering occurs when particles or large gas molecules present in the
atmosphere interact with and cause the electromagnetic radiation to be
redirected from its original path. How much scattering takes place depends
on several factors including the wavelength of the radiation, the abundance
of particles or gases, and the distance the radiation travels through the
atmosphere. The part of the radiation field that has made it through the

atmosphere without being absorbed or scattered back toward space now
reaches the Earth’s surface.
Learning Objectives
After Learning this lesson, you will be able to:
• Know the Interaction of EMR with Atmosphere
• Understand the Interaction of EMR with Earth’s Surface Features

6.1 Interaction of EMR with Atmosphere


There are three types of scattering which take place.
• Rayleigh Scattering
• Mie Scattering

• Nonselective Scattering
6.1.1 Rayleigh Scattering
Rayleigh scattering occurs when particles are very small compared to the
wavelength of the radiation. These could be particles such as small specks
of dust or nitrogen and oxygen molecules. Rayleigh scattering causes
shorter wavelengths of energy to be scattered much more than longer
wavelengths. Rayleigh scattering is the dominant scattering mechanism in
the upper atmosphere. The fact that the sky appears "blue" during the day
is because of this phenomenon. As sunlight passes through the
atmosphere, the shorter wavelengths (i.e., blue) of the visible spectrum are
scattered more than the other (longer) visible wavelengths. At sunrise and
sunset, the light has to travel farther through the atmosphere than at midday
and the scattering of the shorter wavelengths is more complete; this leaves
a greater proportion of the longer wavelengths to penetrate the atmosphere.
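Quantitatively, Rayleigh scattering intensity varies inversely with the fourth power of wavelength (the standard λ⁻⁴ law, not stated explicitly above), which is why the shorter blue wavelengths dominate the daytime sky. A quick check of the blue-to-red ratio, with illustrative wavelengths:

```python
def rayleigh_ratio(wl_short_um, wl_long_um):
    """Relative Rayleigh scattering intensity of the shorter wavelength
    versus the longer one, using the lambda^-4 proportionality
    (constant factors cancel in the ratio)."""
    return (wl_long_um / wl_short_um) ** 4

# Blue (0.4 um) vs red (0.7 um): blue is scattered about 9.4 times more.
print(f"{rayleigh_ratio(0.4, 0.7):.1f}")
```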

Fig. 6.1 Rayleigh Scattering

6.1.2 Mie Scattering
Mie scattering occurs when the particles are just about the same size as the
wavelength of the radiation. Dust, pollen, smoke, and water vapour are
common causes of Mie scattering which tends to affect longer wavelengths
than those affected by Rayleigh scattering. Mie scattering occurs mostly in
the lower portions of the atmosphere where larger particles are more
abundant and dominates when cloud conditions are overcast.

Fig. 6.2 Mie Scattering


6.1.3 Nonselective Scattering
The final scattering mechanism is called nonselective scattering. This
occurs when the particles are much larger than the wavelength of the
radiation. Water droplets and large dust particles can cause this type of
scattering. Nonselective scattering gets its name from the fact that all
wavelengths are scattered about equally. This type of scattering causes fog
and clouds to appear white to our eyes because blue, green, and red light
are all scattered in approximately equal quantities.

Fig. 6.3 Nonselective Scattering

6.2 Interaction of EMR with Earth’s Surface Features
Absorption is the other main mechanism at work when electromagnetic
radiation interacts with the atmosphere. In contrast to scattering, this
phenomenon causes molecules in the atmosphere to absorb energy at
various wavelengths. Ozone, carbon dioxide, and water vapour are the
three main atmospheric constituents which absorb radiation. Ozone serves
to absorb the harmful ultraviolet radiation from the sun. Without this
protective layer in the atmosphere our skin would burn when exposed to
sunlight. You may have heard carbon dioxide referred to as a greenhouse
gas. This is because it tends to absorb radiation strongly in the far infrared
portion of the spectrum - that area associated with thermal heating - which
serves to trap this heat inside the atmosphere. Water vapour in the
atmosphere absorbs much of the incoming long wave infrared and
shortwave microwave radiation. The presence of water vapour in the lower
atmosphere varies greatly from location to location and at different times of
the year. The air mass above a desert would have very little water vapour
to absorb energy, while the tropics would have high concentrations of water
vapour. Radiation that is not absorbed or scattered in the atmosphere can
reach and interact with the Earth's surface. There are three forms of
interaction that can take place when energy strikes or is incident upon the
surface. These are: absorption (A); transmission (T); and reflection (R).

Fig. 6.4 Interaction of EMR with Earth’s Surface Features and Atmosphere

Radiation from the sun, when incident upon the earth’s surface, is either
reflected by the surface, transmitted into the surface or absorbed and
emitted by the surface. The EMR, on interaction, experiences a number of
changes in magnitude, direction, wavelength, polarization and phase.
These changes are detected by the remote sensor and enable the
interpreter to obtain useful information about the object of interest. The
remotely sensed data contain both spatial information (size, shape and
orientation) and spectral information (tone, colour and spectral signature).
In the microwave region of the spectrum, the sensor is radar, which is an
active sensor, as it provides its own source of EMR. The EMR produced by
the radar is transmitted to the earth’s surface and the EMR reflected (back
scattered) from the surface is recorded and analyzed. The microwave
region can also be monitored with passive sensors, called microwave
radiometers, which record the radiation emitted by the terrain in the
microwave region.
6.2.1 Reflection
Of all the interactions in the reflective region, surface reflections are the
most useful and revealing in remote sensing applications. Reflection occurs
when a ray of light is redirected as it strikes a non-transparent surface. The
reflection intensity depends on the surface refractive index, absorption
coefficient and the angles of incidence and reflection.

Fig. 6.5 Different types of scattering surfaces (a) Perfect specular reflector
(b) Near perfect specular reflector (c) Lambertain (d) Quasi-Lambertian
(e) Complex

6.2.2 Transmission
Transmission of radiation occurs when radiation passes through a
substance without significant attenuation. For a given thickness, or depth of
a substance, the ability of a medium to transmit energy is measured as
transmittance.
6.2.3 Spectral Signature
Spectral reflectance is the ratio of reflected energy to incident energy as a
function of wavelength. Various materials of the earth’s surface have
different spectral reflectance characteristics. Spectral reflectance is
responsible for the colour or tone in a photographic image of an object.
Trees appear green because they reflect more of the green wavelength.
The values of the spectral reflectance of objects averaged over different,
well-defined wavelength intervals comprise the spectral signature of the
objects or features by which they can be distinguished.
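Spectral reflectance as defined here, reflected energy divided by incident energy per wavelength band, can be computed band by band. The band names and energy values below are purely illustrative, not measured data:

```python
def spectral_reflectance(reflected, incident):
    """Reflectance = reflected energy / incident energy, per wavelength band."""
    return {band: reflected[band] / incident[band] for band in incident}

# Hypothetical per-band energies for a vegetated surface (illustrative only).
incident = {"blue": 100.0, "green": 100.0, "red": 100.0, "nir": 100.0}
reflected = {"blue": 5.0, "green": 15.0, "red": 6.0, "nir": 45.0}

signature = spectral_reflectance(reflected, incident)
# Higher green than red/blue and strong NIR: a vegetation-like signature.
print(signature)
```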

6.3 Reflectance Characteristics of Earth’s Cover Types


The spectral characteristics of the three main earth surface features are
discussed below:
Vegetation: The spectral characteristics of vegetation vary with
wavelength. Plant pigment in leaves called chlorophyll strongly absorbs
radiation in the red and blue wavelengths but reflects green wavelength.
The internal structure of healthy leaves acts as diffuse reflector of near
infrared wavelengths. Measuring and monitoring the near infrared
reflectance is one way that scientists determine how healthy vegetation may
be.
Water: The majority of radiation incident upon water is not reflected but is
either absorbed or transmitted. Longer visible wavelengths and near
infrared radiation are absorbed more by water than the shorter visible
wavelengths. Thus, water looks blue or blue-green due to stronger
reflectance at these shorter wavelengths, and darker if viewed at red or near
infrared wavelengths. The factors that affect the variability in reflectance of
a water body are depth of water, materials within the water and surface
roughness of the water.
Soil: The majority of radiation incident on a soil surface is either reflected
or absorbed, and little is transmitted. The characteristics of soil that
determine its reflectance properties are its moisture content, organic matter
content, texture, structure and iron oxide content. The presence of moisture
in soil decreases its reflectance.

Fig. 6.6 Interaction of EMR with Earth’s Surface

Let us Sum Up
The interaction between electromagnetic radiation and the Earth’s
atmosphere can be considered to have three components: refraction that
changes the direction of propagation of the radiation field due to density
differences between outer space and the atmosphere, scattering that
changes the direction of propagation of individual photons as they are
absorbed and re-emitted by gasses or aerosols or other atmospheric
constituents without changing wavelength, and absorption that converts
photons into vibrations in a molecule, energy which is re-emitted as one or
more photons with longer wavelength(s). The probability of reflection rather
than absorption happening is termed the reflectance of the surface, and it
depends on the material on the surface as well as the wavelength of the
incoming radiation. Each surface material has a unique ‘signature’ that
defines what proportion of radiation is reflected for each wavelength. For
example, water reflects a small amount of blue and green wavelengths, less
of the red wavelengths, and almost nothing in the infrared wavelengths.
Vegetation, on the other hand, reflects around half of all incoming infrared
radiation, except for specific wavelengths that are effectively absorbed by
liquid water in the leaves. These spectral signatures are commonly
portrayed as graphs, with wavelengths along the x-axis and reflectance
along the y-axis.

Glossary
Rayleigh scattering: Rayleigh scattering occurs when particles are very
small compared to the wavelength of the radiation. These could be particles
such as small specks of dust or nitrogen and oxygen molecules. Rayleigh
scattering causes shorter wavelengths of energy to be scattered much more
than longer wavelengths.
Mie scattering: Mie scattering occurs when the particles are just about the
same size as the wavelength of the radiation. Dust, pollen, smoke and water
vapour are common causes of Mie scattering which tends to affect longer
wavelengths than those affected by Rayleigh scattering. Mie scattering
occurs mostly in the lower portions of the atmosphere where larger particles
are more abundant and dominates when cloud conditions are overcast.
Nonselective scattering: This occurs when the particles are much larger
than the wavelength of the radiation. Water droplets and large dust particles
can cause this type of scattering. Nonselective scattering gets its name from
the fact that all wavelengths are scattered about equally.
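The wavelength dependence of Rayleigh scattering can be made concrete: scattered intensity is proportional to λ⁻⁴. The snippet below models only that proportionality (all constant factors are deliberately omitted) and uses approximate wavelengths for blue and red light.

```python
def rayleigh_scatter(wavelength_um):
    """Relative Rayleigh scattering strength, proportional to lambda^-4.
    Constant physical factors are omitted; only ratios are meaningful."""
    return wavelength_um ** -4

blue, red = 0.45, 0.70  # approximate wavelengths in micrometres
ratio = rayleigh_scatter(blue) / rayleigh_scatter(red)

# Blue light is scattered several times more strongly than red light,
# which is why the clear sky looks blue.
print(f"blue is scattered about {ratio:.1f}x more strongly than red")
```

With these wavelengths the ratio works out to (0.70/0.45)⁴, roughly 5.9.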

Check Your Progress


1. What are the types of scattering when energy interacts with the
atmosphere?
Rayleigh scattering, Mie scattering and nonselective scattering.
2. What is refraction?
Refraction is the bending of the direction of propagation of
electromagnetic radiation as it moves between two media with different
densities.
3. Which scattering causes blue sky?
Rayleigh scattering.

Suggested Readings
1. Campbell, J., 1989. Introduction to Remote Sensing, Guilford, New
York.
2. Lillesand, T. M. and Kiefer, R. W., 1994. Remote Sensing and Image
Interpretation, John Wiley & Sons, New York.
3. Sarkar, A., 2015. Practical Geography: A Systematic Approach, Orient
Black Swan Private Ltd., New Delhi.
4. Anji Reddy, M., 2008. Textbook of Remote Sensing and Geographic
Information System, B.S. Publication, Hyderabad.
5. Campbell, J. B., 2007. Introduction to Remote Sensing, Guilford
Press.
6. Chauniyal, D. D., 2010. Sudur Samvedan evam Bhogolik Suchana Pranali
(Hindi), Sharda Pustak Bhawan, Allahabad.

BLOCK 3

Types of Remote Sensing and Scanners


Unit: 7 Platforms, Types of Platforms and its Characteristics

Unit: 8 Active and Passive, Optical-Mechanical Scanners and Push-


Broom Scanners

Unit: 9 Thermal Remote Sensing and Ideal Remote Sensing


Systems

Unit 7
Platforms, Types of Platforms, and
its Characteristics
Structure
Overview
Learning Objectives

7.1 Remote Sensing Platforms


7.1.1 Ground Based Platforms
7.1.2 Balloon Platforms

7.1.3 Aircraft Platform


7.1.4 Rockets as Platforms
7.1.5 Spacecraft as Platform
7.2 Sensors
7.2.1 Active Sensors
7.2.2 Passive Sensors
7.3 Different Types of Sensors and their Characteristics
7.4 Scanning Mode
7.4.1 Across-Track Scanning System
7.4.2 Along-Track Scanning System
7.4.3 Side Looking or Oblique Scanning Systems (Radar)
7.5 Non-Scanning Mode
Let us Sum Up
Glossary
Check Your Progress
Suggested Readings
Web Sources

Overview
Remote sensing platforms can be defined as the structures or vehicles on
which remote sensing instruments (sensors) are mounted. For remote
sensing applications, sensors should be mounted on suitable stable
platforms. These platforms can be ground based, airborne or spaceborne.
As platform height increases, the area observed increases while the level of
ground detail decreases: the higher the sensor is mounted, the more synoptic
the view obtained, but at a coarser spatial resolution. The type and
characteristics of a platform depend on the type of sensor to be attached
and its application. Platforms for remote sensors may be situated on the
ground, on an aircraft or balloon (or some other platform within the Earth's
atmosphere), or on a spacecraft or satellite outside of the Earth's
atmosphere. Typical platforms are satellites and aircraft, but they can also
include radio-controlled aeroplanes, balloons and kites for low-altitude
remote sensing, as well as ladder trucks or 'cherry pickers' for ground
investigations.
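The trade-off between platform height and ground coverage can be illustrated with simple geometry: for a nadir-looking sensor with a fixed angular field of view, the swath width grows in proportion to altitude. The altitudes and field of view below are hypothetical round numbers, and a flat-Earth approximation is assumed.

```python
import math

def swath_width_km(altitude_km, fov_deg):
    """Ground swath for a nadir-looking sensor with total angular field of
    view fov_deg, using a flat-Earth approximation (fine for a rough
    comparison between platform heights)."""
    return 2 * altitude_km * math.tan(math.radians(fov_deg / 2))

# The same 15-degree field of view covers far more ground from orbit than
# from a survey aircraft (altitudes are hypothetical round numbers).
for h in (3, 10, 700):  # km: low aircraft, high aircraft, satellite
    print(f"{h:4d} km altitude -> {swath_width_km(h, 15):7.1f} km swath")
```

At 700 km the swath is roughly 184 km wide, versus under 1 km from a low-flying aircraft, which is why satellites give the synoptic view described above.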
Learning Objectives
After Learning this lesson, you will be able to:
• Understand the Remote Sensing Platforms

• Acquire the Knowledge of Remote Sensing Sensors

7.1 Remote Sensing Platforms


Platform is a stage where sensor or camera is mounted to acquire
information about a target under investigation. According to Lillesand and
Kiefer (2000), a platform is a vehicle, from which a sensor can be operated.
Platforms can vary from stepladders to satellites. There are different types
of platforms, and based on their altitude above the earth's surface, they may
be classified as follows:
7.1.1 Ground Based Platforms
Wide varieties of ground-based platforms are used in remote sensing. Some
of the common ones are handheld devices, tripods, towers and cranes. To
study properties of a single plant or a small patch of grass, ground based
platform is used. Ground based platforms are also used for sensor
calibration, quality control and for the development of new sensors. For the
field investigations, some of the most popular platforms are 'cherry picker'
platforms, portable masts and towers. The cherry picker
platforms can be extended to approx. 15m. They have been used by various
laboratories to carry spectral reflectance meters and photographic systems.

Portable masts are also available in various forms and can be used to
support cameras and sensors for testing. The main problem with these
masts is that of stabilizing the platform, particularly in windy conditions.
Permanent ground platforms like towers and cranes are used for monitoring
atmospheric phenomenon and long-term monitoring of terrestrial features.
Towers can be built on site and can be tall enough to project through a forest
canopy so that a range of measurements can be taken from the forest floor,
through the canopy and from above the canopy.

Fig. 7.1 Crane, Ground Based Platform

7.1.2 Balloon Platforms


Balloons as platforms are far less expensive than aircraft. They come in a
great variety of shapes, have low acceleration, require no power, and
exhibit low vibration. There are three main types of balloon systems, viz.
free balloons, tethered balloons, and powered balloons. Free balloons can
reach almost the top of the atmosphere; hence, they can provide a platform
at altitudes intermediate between those of aircraft and spacecraft.
Free-floating or anchored balloons have an altitude range of 22-40 km and
can be used to a limited extent as platforms. In India, the Tata Institute
of Fundamental Research, Mumbai, has set up a national balloon facility at
Hyderabad. Tethered balloons are connected to the earth station by means of
wires having high tensile strength and high flexibility.

Fig. 7.2 Balloon as platform
7.1.3 Aircraft Platform
Aerial platforms are primarily fixed-wing aircraft. Helicopters are also
occasionally used for this purpose. Generally, aircraft are used to collect
very detailed images. Helicopters can be used for pinpointing locations,
but they vibrate and lack stability.

a. Low Altitude Aircraft


Low altitude aircraft are the most widely used and generally operate below
30,000 ft. They have a single engine or light twin engines and are suitable
for obtaining large-scale image data over small areas.

Fig. 7.3. Low Altitude Aircraft

b. High Altitude Aircraft
High altitude aircraft are more stable and operate above 30,000 ft. They
include jet aircraft with a good rate of climb, high maximum speed, and a
high operating ceiling, and they acquire smaller-scale imagery covering
large areas. Examples include NHAP, NAPP and AVIRIS. Aircraft platforms
acquire imagery under suitable weather conditions, allow control of platform
variables such as altitude, and the time of coverage can also be controlled.

Fig. 7.4 Image by High Altitude Aircraft


7.1.4 Rockets as Platforms
Prior to use of airplanes, aerial photographs were obtained by rocketing a
camera into the sky and then retrieving the camera and film. Synoptic
imagery can be obtained from rockets for areas of some 500,000 square
km. The Skylark Earth Resources Rocket is fired from a mobile launcher to
altitudes between 90 and 400 km. With the help of a parachute, the payload
and the spent motor are returned to the ground gently thereby enabling
speedy recovery of the photographic records. This rocket system has been
used in surveys over Australia and Argentina. In 1946, V-2 rockets acquired
from Germany after World War II were launched to high altitudes from White
Sands, New Mexico. These rockets contained automated still or movie
cameras that took pictures as the vehicle ascended. The main problem with
rockets is that they are one-time observations only. Except for one time
qualitative or reconnaissance purposes, rocket platforms are not of much
use in regular operational systems.

Fig. 7.5 Rocket as Platform
7.1.5 Spacecraft as Platform
Remote sensing is also conducted from the space shuttle or artificial
satellites. Artificial satellites are manmade objects, which revolve around
another object. The 1960s saw the primary platform used to carry remote
sensing instruments shift from airplanes to satellites. Satellites can cover
much more of the land surface than planes and can monitor areas on a regular
basis. Beginning with TIROS-1, the first Television and Infrared Observation
Satellite, launched in 1960, early weather satellites returned rather poor
views of cloud patterns and almost indistinct images of the earth's surface.

Fig. 7.6 Spacecraft as Platform

Space photography improved further with the Apollo program. Then in 1973,
Skylab, the first American space station, was launched, and its astronauts
took over 35,000 images of the earth with the Earth Resources Experiment
Package on board. Later, with the Landsat and SPOT satellite programs, space
photography received a higher impetus.

7.2 Sensors
A sensor is a device that gathers energy (EMR or other), converts it into a
signal and presents it in a form suitable for obtaining information about the
target under investigation. According to Jensen (2000), remote sensors are
mechanical devices, which collect information, usually in storable form,
about objects or scenes, while being at some distance from them. Sensors
used for remote sensing can be either those operating in Optical Infrared
(OIR) region or those operating in the microwave region. Depending on
the source of energy, sensors are categorized as active or passive:
7.2.1 Active Sensors
Active sensors are those, which have their own source of EMR for
illuminating objects. Radar (Radio Detection and Ranging) and Lidar (Light
Detection and Ranging) are some examples of active sensors. A
photographic camera becomes an active sensor when used with a flash
bulb. Radar is composed of a transmitter and a receiver. The transmitter
emits a wave, which hits objects in the environment and gets reflected or
echoed back to the receiver. The main advantage is that active sensors can
obtain imagery in wavebands where natural signal levels are extremely low
and also are independent of natural illumination. The major disadvantage
with active sensors is that they need high energy levels; therefore, an
adequate input of power is necessary.
7.2.2 Passive Sensors
Passive sensors do not have their own source of energy. These sensors
receive solar electromagnetic energy reflected from the surface or energy
emitted by the surface itself. Therefore, except for thermal sensors they
cannot be used at nighttime. Thus, in passive sensing, there is no control
over the source of electromagnetic radiation. Photographic cameras
(without the use of bulb), multispectral scanners, vidicon cameras etc. are
examples of passive remote sensors. The advantage of passive sensors is
that they are simple and do not require high power. The disadvantage is
that they do not work during bad weather conditions. The

Thematic Mapper (TM) sensor system on the Landsat satellite is a passive
sensor.

7.3 Different Types of Sensors and their


Characteristics
7.3.1 Optical-Infrared Sensors
Optical infrared remote sensors are used to record reflected/emitted
radiation in the visible, near, middle and far infrared regions of the
electromagnetic spectrum. They can observe wavelengths extending from 400
to 2000 nm. The sun is the energy source for optical remote sensing. There
are two kinds of observation
methods using optical sensors: visible/near infrared remote sensing and
thermal infrared remote sensing.
7.3.2 Visible/Near Infrared Remote Sensing
In this, visible light and near infrared rays of sunlight reflected by objects on
the ground are observed. The magnitude of reflection infers the conditions
of land surface, e.g., plant species and their distribution, rivers, lakes, urban
areas etc. In the absence of sunlight or darkness, this method cannot be
used.
7.3.3 Panchromatic Imaging System
In this type of sensor, radiation is detected within a broad wavelength range.
In panchromatic band, visible and near infrared are included. The imagery
appears as a black and white photograph. Examples of panchromatic
imaging systems are Landsat ETM+ PAN, SPOT HRV-PAN and IKONOS
PAN, IRS-1C, IRS-1D and CARTOSAT-series. Spectral range of
Panchromatic band of ETM+ is 0.52 µm to 0.9 µm, CARTOSAT-2B is 0.45-
0.85 µm, SPOT is 0.45- 0.745 µm.
7.3.4 Multispectral imaging System
The multispectral imaging system uses multichannel detectors and records
radiation within a narrow range of wavelengths. Both brightness and color
information are available on the image. LANDSAT, LANDSAT TM, SPOT
HRV-XS and LISS etc. are examples.
7.3.5 Thermal Infrared Remote Sensing
In thermal infrared remote sensing, sensors record the energy (heat)
radiated by the earth's surface as a result of its interaction with solar
radiation. It is also used to observe high-temperature areas, such as
volcanic activity and forest fires. Based on the strength of the radiation,
one can estimate the surface temperatures of land and sea, and the status
of volcanic activity and forest fires.
7.3.6 Hyper Spectral Imaging System
A hyperspectral imaging system records the radiation of terrain in hundreds
of narrow spectral bands. Therefore, the spectral signature of an object can
be measured accurately, which helps identify objects more precisely. For
example, Hyperion data is recorded in 242 spectral bands and AVIRIS data
is recorded in 224 spectral bands.
7.3.7 Microwave Sensors
These types of sensors receive microwaves, which have longer
wavelengths than visible light and infrared rays. The observation is not
affected by day, night or weather. The microwave portion of the spectrum
includes wavelengths within the approximate range of 1 mm to 1 m. The
longest microwaves are about 2,500,000 times longer than the shortest light
waves. There are two types of observation methods using microwave
sensor: a) Active sensor- The sensor emits microwaves and observes
microwaves reflected by land surface features. It is used to observe
mountains, valleys, surface of oceans wind, wave and ice conditions and b)
Passive sensor- This type of sensor records microwaves that naturally
radiated from earth surface features.
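The 1 mm to 1 m microwave range can equivalently be stated in frequency using c = fλ. The snippet below performs that conversion with the standard speed-of-light constant.

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def frequency_ghz(wavelength_m):
    """Frequency in GHz corresponding to a free-space wavelength in metres,
    from c = f * lambda."""
    return C / wavelength_m / 1e9

# The 1 mm - 1 m microwave range spans roughly 0.3 to 300 GHz.
print(frequency_ghz(0.001))  # ~300 GHz (1 mm)
print(frequency_ghz(1.0))    # ~0.3 GHz, i.e. 300 MHz (1 m)
```

The same relation also confirms the text's factor above: a 1 m microwave is about 2.5 million times longer than a 0.4 µm light wave.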

7.4 Scanning Mode


A scanning system is formed when the scene is scanned point by point
along successive lines over a finite time. In a series of parallel scan lines,
multispectral scanning systems sweep the detector's IFOV across the
terrain. The common types of scanning modes are:

1. Across-track scanning
2. Along-track scanning system
3. Side looking or oblique scanning system (Radar)
7.4.1 Across-Track Scanning System
This scanning system makes use of a faceted mirror that is rotated by an
electric motor with the horizontal axis of rotation parallel to the direction
of flight. The mirror scans the landscape in a pattern of parallel scan lines
at right angles to the path of the airborne platform. The mirror directs
energy reflected or radiated from the ground onto the detector. This type
of scanner is also known as a whisk-broom scanner system. The energy flux,
sensor altitude, detector spectral bandwidth, IFOV, and dwell time all
affect the strength of the sensor signal produced by a detector. Because
across-track scanners have a short dwell time, the detector receives less
energy and generates a weaker signal. The Multispectral Scanner (MSS) and
Thematic Mapper (TM) of the Landsat series of satellites are examples of
across-track scanners.

7.4.2 Along-Track Scanning System

Along-track scanners collect multiband image data along a swath beneath
the aircraft. As the aircraft moves ahead,
the scanner scans the Earth with respect to the planned swath to create a
two-dimensional image by recording successive scan lines that are aligned
at right angles to the aircraft's path. In this system, detectors are arranged
in a linear array in the focal plane of an image formed by a lens system. This
system is also known as a push-broom system, since the array of detectors
recording landscape brightness along a line of pixels is effectively pushed
along the aircraft's path like a broom. Along-track scanners have a long
dwell time, and their detectors achieve excellent spatial and spectral
resolutions. The contrast with an across-track system is that a push-broom
sensor uses linear arrays of detectors instead of a rotating mirror. The
SPOT High Resolution Visible (HRV) camera and the Linear Imaging
Self-Scanning (LISS) sensors of the IRS satellites are examples of
along-track scanners.
7.4.3 Side Looking or Oblique Scanning Systems (Radar):
A scanning system that looks to the side is an active scanning system, such
as radar. This system creates EMR, then exposes the terrain and detects
the energy (radar pulses) returning from the terrain, which is then recorded
as an image. Radar imagery is thus obtained by collecting and analyzing
the reflections of pulses emitted by radar-equipped aircraft and satellites.
SLAR (Side Looking Airborne Radar) is a common kind of remote sensing
technique used to obtain radar images of the terrain. SLAR's primary
components include an antenna, a duplexer, a transmitter, a receiver, a
pulse-generating device, and a cathode ray tube.

7.5 Non-Scanning Mode


It is referred to as a non-scanning system when the complete scene is
sensed directly by the sensor. Such sensors can be classified into:
• Imaging Sensors
• Non-imaging Sensors

7.5.1 Imaging Sensors:


An imaging sensor generates a two-dimensional image by measuring radiation
intensity as a function of position on the earth's surface. Consider a
camera or a scanner, for example. Imaging sensors include
optical imaging sensors, thermal IR imaging sensors, and radar imaging
sensors.

Optical imaging sensor: Optical imaging sensors use the visible and
reflected IR bands. Optical imaging systems utilised on space platforms
include panchromatic, multispectral, and hyper spectral systems.
Thermal IR imaging sensor: A thermal sensor typically operates in the
electromagnetic spectrum between 9 and 14 μm, in the mid-to-far-infrared
range. Any object with a temperature above absolute zero emits infrared
radiation and can therefore produce a thermal image.
Radar imaging sensor: A radar (microwave) imaging sensor is often an
active sensor that operates in the electromagnetic spectrum between 1 mm
and 1 m. The sensor transmits microwave pulses to the ground, and the
target reflects the energy
back to the radar antenna, creating a microwave image. The radar follows
a flight path, and the radar's illuminated area, or footprint, travels across the
surface in a swath.
7.5.2 Non-Imaging Sensors:
A profile recorder is a non-imaging sensor that measures a signal based on
the intensity over its full field of view; it does not record how the signal
varies spatially within that field of view. Non-imaging
sensors used in remote sensing include radiometers, altimeters,
spectrometers, spectro radiometers, and LIDAR.

Radiometer: A radiometer is any piece of equipment that quantitatively
measures electromagnetic radiation in a specific range of the
electromagnetic spectrum.
Spectrometer: A spectrometer is a sensor with a component, such as a
prism or diffraction grating, that may break a portion of the spectrum into
discrete wavelengths and scatter (or separate) them at different angles to
an array of detectors.
Spectro-radiometer: Spectro-radiometers are sensors that gather diffused
radiation in bands rather than specific wavelengths. The most popular
air/space sensors are spectro radiometers.

Let us Sum Up
Platform is the vehicle or carrier for remote sensors, from which a sensor
can be operated. Weather Surveillance Radar is of the long-range type
which detects and tracks typhoons and cloud masses at distances of 400
kilometres or less. The radar is a useful tool in tracking and monitoring
tropical cyclones. Active Sensors are the sensors that detect reflected
responses from objects which are irradiated from artificially generated
energy sources. A spectrometer is a sensor with a component, such as a
prism or diffraction grating, that may break a portion of the spectrum into
discrete wavelengths and scatter (or separate) them at different angles to
an array of detectors.

Glossary
The GPS (Global Positioning System) is a satellite-based navigation system
with at least 24 satellites. GPS works anywhere on the globe, in any
weather conditions.
A rover is a planetary surface exploration device designed to move on the
solid surface of a planet or other celestial body of planetary mass.

Check Your Progress


1. What are the types of platforms?
Ground based, Air borne, and Space borne platforms.

2. What is the space shuttle?


The space shuttle, also known as the Space Transportation System, is
a partially reusable rocket-launched vehicle.
3. What is a radiometer?
A radiometer is any piece of equipment that quantitatively measures
electromagnetic radiation in a specific range of the electromagnetic
spectrum.

Suggested Readings
1. Campbell, J., 1989. Introduction to Remote Sensing, Guilford, New
York.
2. Lillesand, T. M. and Kiefer, R. W., 1994. Remote Sensing and Image
Interpretation, John Wiley & Sons, New York.
3. Sarkar, A., 2015. Practical Geography: A Systematic Approach, Orient
Black Swan Private Ltd., New Delhi.
4. Anji Reddy, M., 2008. Textbook of Remote Sensing and Geographic
Information System, B.S. Publication, Hyderabad.
5. Campbell, J. B., 2007. Introduction to Remote Sensing, Guilford
Press.
6. Chauniyal, D. D., 2010. Sudur Samvedan evam Bhogolik Suchana Pranali
(Hindi), Sharda Pustak Bhawan, Allahabad.

Web Sources
1. http://www.drbrambedkarcollege.ac.in/sites/default/files/Remote%20
sensing%20platforms.pdf
2. https://eos.com/blog/types-of-remote-sensing/

Unit 8
Active and Passive, Optical-Mechanical
Scanners and Push-Broom Scanners
Structure
Overview
Learning Objectives

8.1 Active and Passive Sensors


8.1.1 Active Sensors
8.1.2 Passive Sensors

8.2 Optical-Mechanical Scanners


8.3 Push-Broom Scanners
Let us Sum Up
Glossary
Check Your Progress
Suggested Readings
Web Sources

Overview
The across-track scanning system makes use of a faceted mirror that is rotated by an
electric motor with the horizontal axis of rotation parallel to the direction of
flight. The mirror scans the landscape in a pattern of parallel scan lines at
right angles to the path of the airborne platform. The mirror directs energy
reflected or radiated from the ground onto the detector. This type of
scanner is also known as a whisk-broom scanner system. The energy flux,
sensor altitude, detector spectral bandwidth, IFOV, and dwell time all
affect the strength of the sensor signal produced by a detector. Because
across-track scanners have a short dwell time, the detector receives less
energy and generates a weaker signal.
Learning Objectives
After Learning this lesson, you will be able to:
• Understand the Active and Passive Scanner

• Acquire the Knowledge of Optical-Mechanical Scanners and Push-
Broom Scanners

8.1 Active and Passive Sensors


8.1.1 Active Sensors
Active sensors are those, which have their own source of EMR for
illuminating the objects. Radar (Radio Detection and Ranging) and Lidar
(Light Detection and Ranging) are some examples of active sensor.
Photographic camera becomes an active sensor when used with a flash
bulb. Radar is composed of a transmitter and a receiver. The transmitter
emits a wave, which hits objects in the environment and gets reflected or
echoed back to the receiver. The main advantage is that active sensors can
obtain imagery in wavebands where natural signal levels are extremely low
and are independent of natural illumination. The major disadvantage with
active sensors is that they need high energy levels; therefore, an adequate
input of power is necessary.
8.1.2 Passive Sensors
Passive sensors do not have their own source of energy. These sensors
receive solar electromagnetic energy reflected from the surface or energy
emitted by the surface itself. Therefore, except for thermal sensors they
cannot be used at nighttime. Thus, in passive sensing, there is no control
over the source of electromagnetic radiation. Photographic cameras
(without the use of a flash bulb), multispectral scanners, vidicon cameras,
etc. are examples of passive remote sensors. The advantage of passive
sensors is that they are simple and do not require high power. The
disadvantage is that they do not work during bad weather conditions. The
Thematic Mapper (TM) sensor system on the Landsat satellite is a passive
sensor.

8.2 Optical-Mechanical Scanners


An optical mechanical scanner is a multispectral radiometer by which two-
dimensional imagery can be recorded using a combination of the motion of
the platform and a rotating or oscillating mirror scanning perpendicular to
the flight direction. Optical mechanical scanners are composed of an optical
system, spectrographic system, scanning system, detector system and
reference system. Multispectral scanner (MSS) and thematic mapper (TM)
of LANDSAT, and Advanced Very High-Resolution Radiometer (AVHRR) of
NOAA are the examples of optical mechanical scanners. M2S made by
Daedalus Company is an example of an airborne type of optical mechanical
scanner.

The function of the elements of an optical mechanical scanner are as
follows.
a. Optical system: A reflective telescope system such as a Newtonian,
Cassegrain or Ritchey-Chretien design is used to avoid chromatic aberration.
b. Spectrographic system: Dichroic mirror, grating, prism or filter are
utilized.

c. Scanning system: A rotating or oscillating mirror is used for


scanning perpendicular to the flight direction.
d. Detector system: Electromagnetic energy is converted to an electric
signal by optical-electronic detectors. Photomultiplier detectors are
utilized in the near-ultraviolet and visible region, silicon diodes in
the visible and near infrared, cooled indium antimonide (InSb) in the
shortwave infrared, and thermal bolometers or cooled HgCdTe in the
thermal infrared.
e. Reference system: The converted electric signal is influenced by a
change of sensitivity of the detector. Therefore, light sources or
thermal sources with constant intensity or temperature should be
installed as a reference for calibration of the electric signal.

Fig. 8.1 Data Acquisition by Optical Mechanical Scanner

Compared to the push-broom scanner, the optical mechanical scanner has
certain advantages: the view angle of the optical system can be very
narrow, band-to-band registration error is small, and resolution is higher.
It has the disadvantage that the signal-to-noise ratio (S/N) is lower,
because the integration time at the optical detector cannot be very long
due to the scanner motion.

8.3 Push-Broom Scanners


The pushbroom scanner has a linear array of detectors, in which each
detector measures the radiation reflected from a small area on the ground.
In this type of scanning system, linear array of detectors scan in the direction
parallel to the flight line. Linear arrays normally consist of numerous charge-
coupled devices (CCDs) positioned end to end. Charge-Coupled Devices
are designed to be very small, and a single array may contain over 10,000
individual detectors. Normally, the arrays are in the focal plane of the
scanner such that all scan lines are viewed by all arrays simultaneously.
This system is more reliable than the scanning mirror of the across-track
system, and the detectors are light and need little power to operate.

Fig. 8.2 Push-Broom Scanner
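The dwell-time advantage of the push-broom design over the optical-mechanical (whisk-broom) design can be made concrete with rough numbers. The ground speed, pixel size, and detector count below are hypothetical round values, not specifications of any particular sensor.

```python
def dwell_times_s(pixel_m, ground_speed_ms, pixels_per_line):
    """Rough per-pixel dwell times. A push-broom detector stares at its
    pixel for the whole line period; a single whisk-broom detector must
    share that period across every pixel in the scan line."""
    line_period = pixel_m / ground_speed_ms  # time to advance one line
    return line_period, line_period / pixels_per_line

push, whisk = dwell_times_s(pixel_m=10, ground_speed_ms=7000,
                            pixels_per_line=6000)
print(f"push-broom: {push * 1e3:.2f} ms per pixel")
print(f"whisk-broom: {whisk * 1e6:.2f} us per pixel")
```

With these assumed numbers the push-broom detector integrates for thousands of times longer per pixel, which is the source of its better signal-to-noise ratio noted in Section 8.2.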


Let us Sum Up
An optical-mechanical scanner includes a device for scanning a field of
vision to detect and recognize, with high resolution, distant objects in
the field of vision. Scanning is accomplished in two directions: a
direction x for line scanning and a direction y for raster (image)
scanning. One embodiment of the scanning device includes, in order along
the direction of the path of a mean incident beam from the field of vision,
an objective, a raster-scanning mirror for scanning in the y direction, a
field mirror which delimits the field of the objective in the x direction,
a rotating drum and an image transport system for line scanning in the x
direction (the field mirror deflecting the beams towards the drum), and a
detector sensitive to the radiation contained in the beams, with the
scanning device ensuring convergence of the beams at the detector. A
push-broom scanner, also
known as an along-track scanner, is a device for obtaining images
with spectroscopic sensors. The scanners are regularly used for
passive remote sensing from space, and in spectral analysis on production
lines, for example with near-infrared spectroscopy used to identify
contaminated food and feed. The moving scanner line in a traditional
photocopier is also a familiar, everyday example of a push broom scanner.
Push-broom scanners and the whisk-broom scanner variant are often
contrasted with staring arrays (such as in a digital camera), which image
objects without scanning and are more familiar to most people.

Glossary
Optical system: A reflective telescope system such as a Newtonian,
Cassegrain or Ritchey-Chretien design is used to avoid chromatic aberration.
Spectrographic system: Dichroic mirror, grating, prism or filter are utilized.
Scanning system: A rotating or oscillating mirror is used for scanning
perpendicular to the flight direction.
Detector system: Electromagnetic energy is converted to an electric signal
by optical-electronic detectors. Photomultiplier detectors are utilized in
the near-ultraviolet and visible region, silicon diodes in the visible and
near infrared, cooled indium antimonide (InSb) in the shortwave infrared,
and thermal bolometers or cooled HgCdTe in the thermal infrared.
Reference system: The converted electric signal is influenced by a change
of sensitivity of the detector. Therefore, light sources or thermal sources with
constant intensity or temperature should be installed as a reference for
calibration of the electric signal.

Check Your Progress


1. What is an Optical-Mechanical Scanner?
An optical mechanical scanner is a multispectral radiometer by which
two-dimensional imagery can be recorded using a combination of the
motion of the platform and a rotating or oscillating mirror scanning
perpendicular to the flight direction.
2. What is a Push-Broom Scanner?
The push broom scanner has a linear array of detectors, in which each
detector measures the radiation reflected from a small area on the
ground. In this type of scanning system, linear array of detectors scan
in the direction parallel to the flight line.

Suggested Readings
1. Campbell, J., 1989. Introduction to Remote Sensing, Guilford, New
York.
2. Lillesand, T. M. and Kiefer, R. W., 1994. Remote Sensing and Image
Interpretation, John Wiley & Sons, New York.
3. Sarkar, A., 2015. Practical Geography: A Systematic Approach, Orient
Black Swan Private Ltd., New Delhi.
4. Anji Reddy, M., 2008. Textbook of Remote Sensing and Geographic
Information System, B.S. Publication, Hyderabad.
5. Campbell, J. B., 2007. Introduction to Remote Sensing, Guilford
Press.
6. Chauniyal, D. D., 2010. Sudur Samvedan evam Bhogolik Suchana Pranali
(Hindi), Sharda Pustak Bhawan, Allahabad.

Web Sources
1. http://www.drbrambedkarcollege.ac.in/sites/default/files/Remote%20
sensing%20platforms.pdf
2. https://eos.com/blog/types-of-remote-sensing/

Unit 9
Thermal Remote Sensing and Ideal Remote
Sensing Systems
Structure
Overview
Learning Objectives

9.1 Thermal Remote Sensing


9.1.1 Thermal Atmospheric Windows
9.1.2 Wavelength / Spectral Range

9.1.3 Spectral Emissivity and Kinetic Temperature


9.2 Emissivity
9.3 Thermal Sensors and Satellites
9.3.1 Thermal Sensors
9.4 Ideal Remote Sensing
Let us Sum Up
Glossary
Check Your Progress
Suggested Readings
Web Sources

Overview
The earth-atmosphere system derives its energy from the sun which, being
at a very high temperature, radiates maximum energy in the shorter
wavelengths (visible, 0.20 to 0.80 µm). The earth-atmosphere system
absorbs part of this energy (the remainder is reflected by the surface,
owing to its albedo, and by clouds and other reflectors/scatterers in the
atmosphere), which in turn heats it up and raises its temperature. Being at
a temperature of around 300 K, the system emits its own radiation in the
longer wavelengths called 'thermal infrared'. Observations in the thermal
wavelengths of the electromagnetic spectrum (3-35 µm) are generally
referred to as thermal remote sensing. In this region the radiation emitted
by the earth due to its thermal state is far more intense than the solar-
reflected radiation, so any sensor operating in this wavelength region
primarily detects the thermal radiative properties of ground material.
All materials at a temperature above absolute zero (0 K, or -273 °C) emit
infrared energy, both day and night. Infrared sensing refers to the detection
of remote objects by recording the amount of infrared energy emitted from
various surfaces as a continuous-tone image on photographic film. Thermal
IR imagery is usually obtained in the wavelength regions 3 to 5.5 µm and
8 to 14 µm because of atmospheric absorption at other wavelengths.
Learning Objectives
After Learning this lesson, you will be able to:
• Understand the Thermal Remote Sensing
• Acquire the Knowledge of Ideal Remote Sensing

9.1 Thermal Remote Sensing


Thermal remote sensing refers to measuring the energy that is being
emitted from the Earth’s surfaces rather than measuring the reflected
energy. Therefore, it is important to remember that thermal infrared equates
to emitted infrared energy rather than reflected. Thermal remote sensing is
a type of passive remote sensing since it is measuring naturally emitted
energy. Thermal infrared remote sensing is used to measure land and
ocean temperatures, atmospheric temperatures and humidity and the
Earth's radiation balance.
Thermal remote sensing is the branch of remote sensing that deals with the
acquisition, processing and interpretation of data acquired primarily in the
thermal infrared (TIR) region of the electromagnetic (EM) spectrum. In
thermal remote sensing we measure the radiation 'emitted' from the surface
of the target, as opposed to optical remote sensing where we measure the
radiation 'reflected' by the target under consideration. Useful reviews on
thermal remote sensing are given by Kahle (1980), Sabins (1996) and
Gupta (1991). It is a well-known fact that all natural targets reflect as well as
emit radiation. In the TIR region of the EM spectrum, the radiation emitted
by the earth due to its thermal state is far more intense than the solar-
reflected radiation and therefore, sensors operating in this wavelength
region primarily detect thermal radiative properties of the ground material.
However, as also discussed later in this unit, very high temperature
bodies also emit substantial radiation at shorter wavelengths. As thermal
remote sensing deals with the measurement of emitted radiation, for high-
temperature phenomena the realm of thermal remote sensing broadens
to encompass not only the TIR but also the short-wave infrared (SWIR),
near infrared (NIR) and in extreme cases even the visible region of the EM
spectrum.
Thermal remote sensing, in principle, is different from remote sensing in the
optical and microwave region. In practice, thermal data proves to be
complementary to other remote sensing data. Thus, though still not fully
explored, thermal remote sensing reserves potential for a variety of
applications.
9.1.1 Thermal Atmospheric Windows
While the thermal IR region extends from 3 to 14 μm, only portions of the
spectrum are suitable for remote sensing applications. There are several
atmospheric windows in the thermal portion of the spectrum, but none of the
windows transmits 100 % of the emitted radiation. Water vapor and carbon
dioxide absorb some of the energy across the spectrum and ozone absorbs
energy specifically in the 10.5-12.5 μm range. The gases and particles in
the atmosphere also absorb incoming radiation and emit their own thermal
energy. Most thermal sensing is performed in the 8-14 μm region of the
spectrum not only because it includes an atmospheric window, but because
it contains the peak energy emissions for most of Earth’s surface features.
9.1.2 Wavelength / Spectral Range
The infrared portion of the electromagnetic spectrum is usually considered
to be from 0.7 to 1,000 µm. Within this infrared portion, nomenclature
varies and there is little consensus among groups on how to define the
sub-boundaries. In terrestrial remote sensing the region of 3 to 35 µm is
popularly called thermal infrared. As in all other remote sensing missions,
data acquisitions are made only in regions of least spectral absorption
known as the atmospheric windows. Within the thermal infrared an excellent
atmospheric window lies between 8-14 µm wavelength. Poorer windows lie
in 3-5 µm and 17-25 µm. Interpretation of the data in 3-5 µm is complicated
due to overlap with solar reflection in day imagery and 17-25 µm region is
still not well investigated. Thus 8-14 µm region has been of greatest interest
for thermal remote sensing.
9.1.3 Spectral Emissivity and Kinetic Temperature
Thermal remote sensing exploits the fact that everything above absolute
zero (0 K or -273.15 °C or –459 °F) emits radiation in the infrared range of
the electromagnetic spectrum. How much energy is radiated, and at which
wavelengths, depends on the emissivity of the surface and on its kinetic
temperature. Emissivity is the emitting ability of a real material compared to
that of a black body and is a spectral property that varies with composition
of material and geometric configuration of the surface. Emissivity denoted
by epsilon (ε) is a ratio and varies between 0 and 1. For most natural
materials, it ranges between 0.7 and 0.95. Kinetic temperature is the surface
temperature of a body/ground and is a measure of the amount of heat
energy contained in it. It is measured in different units, such as in Kelvin (K);
degrees Centigrade (°C); degrees Fahrenheit (°F).
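The relationship between emissivity, kinetic temperature and emitted energy can be made concrete with the Stefan-Boltzmann law, M = εσT⁴, which gives the total radiant exitance of a surface. A minimal Python sketch follows; the 300 K temperature and 0.95 emissivity are illustrative values, not measurements from any particular sensor:

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def radiant_exitance(kinetic_temp_k, emissivity):
    """Total energy emitted per unit area (W/m^2): M = emissivity * sigma * T^4."""
    return emissivity * SIGMA * kinetic_temp_k ** 4

# A surface at 300 K with emissivity 0.95 (typical of water or moist soil)
print(round(radiant_exitance(300.0, 0.95), 1))  # about 436.3 W/m^2
print(round(radiant_exitance(300.0, 1.00), 1))  # blackbody limit, about 459.3 W/m^2
```

The fourth-power dependence on temperature is why even small temperature differences between surfaces produce readily measurable differences in emitted energy.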

9.2 Emissivity
Objects in the real world are not perfect blackbodies. Not all of the incident
energy upon them is absorbed, therefore they are not perfect emitters of
radiation. The emissivity (ε) of a material is the relative ability of its surface
to emit heat by radiation. Emissivity is defined as the ratio of the energy
radiated from an object's surface to the energy radiated from a blackbody
at the same temperature.
Emissivity values can range from 0 to 1. A blackbody has an emissivity of
1, while a perfect reflector or whitebody has an emissivity of 0. Most natural
objects are considered "graybodies" as they emit a fraction of their
maximum possible blackbody radiation at a given temperature. Water has
an emissivity close to 1 and most vegetation also has an emissivity close to
1. Many minerals and metals have emissivities significantly less than 1.
Depending on the material, emissivity can also vary depending on its
temperature. Below are emissivities for some common materials.

Material                 Emissivity
Aluminum (anodized)      0.77
Aluminum (polished)      0.05
Concrete                 0.92
Copper (polished)        0.05
Copper (oxidized)        0.65
Glass                    0.92
Gypsum                   0.85
Ice                      0.97
Sand                     0.90
Snow                     0.80
Soil (dry)               0.92
Soil (saturated)         0.95
Stainless steel          0.59
Water                    0.95

The emissivity of a surface depends not only on the material but also on the
nature of the surface. For example, a clean and polished metal surface will
have a low emissivity, whereas a roughened and oxidized metal surface will
have a high emissivity. Two materials lying next to one another on the
ground could have the same true kinetic temperature but have different
apparent radiant temperatures when sensed by a thermal radiometer simply
because their emissivities are different. Emissivity can be used to help
identify mineral composition. Knowledge of surface emissivity is also
essential for obtaining accurate true kinetic temperature measurements
from radiometers.
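The effect just described, the same kinetic temperature producing different apparent radiant temperatures, follows from the relation T_rad = ε^(1/4) · T_kin. A short Python sketch, using illustrative emissivity values from the table above:

```python
def radiant_temperature(kinetic_temp_k, emissivity):
    """Apparent temperature seen by a radiometer: T_rad = emissivity**0.25 * T_kin."""
    return emissivity ** 0.25 * kinetic_temp_k

# Two surfaces at the same 300 K kinetic temperature
print(round(radiant_temperature(300.0, 0.95), 1))  # high emissivity: about 296.2 K
print(round(radiant_temperature(300.0, 0.05), 1))  # polished metal: about 141.9 K
```

This is why polished metal surfaces can look strikingly "cold" in thermal imagery even when they are physically warm.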
Many materials (graybodies) have an emissivity less than 1 that is constant
across all wavelengths. At any given wavelength, the emitted energy of a
graybody is a fixed fraction of that of a blackbody. For other materials,
emissivity varies with wavelength; these are referred to as selective
radiators, or as being selectively radiant. Such materials may behave like
blackbodies at certain wavelengths (ε close to 1) but have reduced
emissivity at other wavelengths. Quartz and feldspar, for example, are both
selective radiators, with quartz showing considerably more variation in
emissivity across the thermal wavelengths.

9.3 Thermal Sensors and Satellites


Thermal sensors or scanners detect emitted radiant energy. Due to
atmospheric effects these sensors usually operate in the 3 to 5 μm or 8 to
14μm range. Most thermal remote sensing of Earth features is focused in
the 8 to 14 μm range because peak emission (based on Wien's law) for
objects around 300 K (27 °C or 80 °F) occurs at 9.7 μm. Many thermal
imaging sensors are on satellite platforms, although they can also be
located on-board aircraft or on ground-based systems. Many thermal
systems are multispectral, meaning they collect data on emitted radiation
across a variety of wavelengths.
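The 9.7 μm figure quoted above follows directly from Wien's displacement law, λ_max = b / T with b ≈ 2898 μm·K. A quick check in Python:

```python
WIEN_B_UM_K = 2898.0  # Wien displacement constant, micrometre-kelvin

def peak_wavelength_um(temp_k):
    """Wavelength of peak blackbody emission (micrometres), by Wien's law."""
    return WIEN_B_UM_K / temp_k

print(round(peak_wavelength_um(300.0), 1))   # 9.7 um: Earth's surface, inside the 8-14 um window
print(round(peak_wavelength_um(5778.0), 2))  # about 0.5 um: the Sun, in the visible
```

The same law explains why very hot targets such as fires or lava, at over 1000 K, push their emission peaks into the SWIR and NIR, as noted earlier in this unit.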
9.3.1 Thermal Sensors
a. Thermal Infrared Multispectral Scanner (TIMS)
NASA and the Jet Propulsion Laboratory developed the Thermal Infrared
Multispectral Scanner (TIMS) for exploiting mineral signature information.
TIMS is a multispectral scanning system with six different bands ranging
from 8.2 to 12.2 μm and a spatial resolution of 18m. TIMS is mounted on an
aircraft and was primarily designed as an airborne geologic remote sensing
tool. TIMS acquires mineral signature data that permits the discrimination of
silicate, carbonate and hydrothermally altered rocks. TIMS data have been
used extensively in volcanology research in the western United States,
Hawaiian Islands and Europe. Multispectral data allows for the generating
of three-band color composites like other multispectral data. Many materials
have varying emissivities and can be identified by the variation in emitted
energy.
A well-known example is a thermal image of Death Valley, California,
captured by TIMS. A color composite has been produced using three
thermal bands
collected by TIMS. There are a variety of different materials and minerals in
Death Valley with varying emissivities. In this image Thermal Band 1 (8.2 -
8.6μm) is displayed in blue, Thermal Band 3 (9.0 - 9.4μm) is displayed in
green and Thermal Band 5 (10.2 - 11.2 μm) is displayed in red. Alluvial fans
appear in shades of reds, lavender, and blue greens; saline soils in yellow;
and different saline deposits in blues and greens.
b. Advanced Spaceborne Thermal Emission and Reflection
Radiometer (ASTER)

Advanced Spaceborne Thermal Emission and Reflection Radiometer


(ASTER) is a sensor on-board the Terra satellite. In addition to collecting
reflective data in the visible, near and shortwave infrared, ASTER also
collects thermal infrared data. ASTER has five thermal bands ranging from
8.1 to 11.6 μm with 90m spatial resolution. ASTER data are used to create
detailed maps of surface temperature of land, emissivity, reflectance, and
elevation. ASTER data is available for download through EarthExplorer.
c. Moderate-resolution Imaging Spectroradiometer (MODIS)
As previously discussed, MODIS has a high spectral resolution and collects
data in a variety of wavelengths. Like ASTER, MODIS collects both
reflective and emitted thermal data. MODIS has several bands that collect
thermal data with 1000 m spatial resolution. MODIS has high temporal resolution
with a one-to-two-day return time. This makes it an excellent resource for
detecting and monitoring wildfires. One of the products generated from
MODIS data is the Thermal Anomalies/Fire product which detects hotspots
and fires.
d. Landsat

A variety of the Landsat satellites have carried thermal sensors. The first
Landsat satellite to collect thermal data was Landsat 3, however this part of
the sensor failed shortly after the satellite was launched. Landsat 4 and 5
included a single thermal band (band 6) on the Thematic Mapper (TM)
sensor with 120m spatial resolution that has been resampled to 30m. A
similar band was included on the Enhanced Thematic Mapper Plus (ETM+)
on Landsat 7. Landsat 8 includes a separate thermal sensor known as the
Thermal Infrared Sensor (TIRS). TIRS has two thermal bands, Band 10
(10.60 - 11.19μm) and Band 11 (11.50 - 12.51μm). The TIRS bands are
acquired at 100 m spatial resolution but are resampled to 30m in the
delivered data products.
e. Landsat TIRS and Applications
Irrigation accounts for 80% of freshwater use in the U.S., and water usage
has become an increasingly important issue, particularly in the West.
Thermal infrared data from Landsat 8 is being used to estimate water use.
Landsat 8 data, including visible, near infrared, mid-infrared, and thermal
data are fed into a relatively sophisticated energy balance model that
produces evapotranspiration maps. Evapotranspiration (ET) refers to the
conversion of water into water vapor by the dual process of evaporation
from the soil and transpiration (the escape of water through plants' stomata).
For vegetated land, ET is synonymous with water consumption. Landsat
data enable water resources managers and administrators to determine
how much water was consumed from individual fields.

9.4 Ideal Remote Sensing


The basic components of an ideal remote sensing system are as follows:

A Uniform Energy Source which provides energy over all wavelengths, at a


constant, known, high level of output.
A Non-interfering Atmosphere which will not modify either the energy
transmitted from the source or emitted (or reflected) from the object in any
manner.

A Series of Unique Energy/Matter Interactions at the Earth's Surface which
generate reflected and/or emitted signals that are selective with respect to
wavelength and unique to each object or earth surface feature type.
A Super Sensor which is highly sensitive to all wavelengths. A super sensor
would be simple, reliable, accurate, economical, and would require no power
or space. This sensor yields data on the absolute brightness (or radiance)
of a scene as a function of wavelength.
A Real-Time Data Handling System which generates the instantaneous
radiance-versus-wavelength response and processes it into an interpretable
format in real time. The data derived are unique to a particular terrain and
hence provide insight into its physical, chemical and biological state.
Multiple Data Users having knowledge in their respective disciplines and in
remote sensing data acquisition and analysis techniques. The information
collected will be available to them faster and at less expense. This
information will aid the users in various decision-making processes and
further in implementing these decisions.

Fig. 9.1 Ideal Remote Sensing System with Components

Let us Sum Up
Thermal remote sensing, in principle, is different from remote sensing in the
optical and microwave region. In practice, thermal data proves to be
complementary to other remote sensing data. Thus, though still not fully
explored, thermal remote sensing reserves potential for a variety of
applications. In terrestrial remote sensing the region of 3 to 35 µm is
popularly called thermal infrared. As in all other remote sensing missions,
data acquisitions are made only in regions of least spectral absorption
known as the atmospheric windows. Emissivity is the emitting ability of a
real material compared to that of a black body and is a spectral property that
varies with composition of material and geometric configuration of the
surface.

Glossary
A Uniform Energy Source which provides energy over all wavelengths, at a
constant, known, high level of output.
A Non-interfering Atmosphere which will not modify either the energy
transmitted from the source or emitted (or reflected) from the object in any
manner.
A Series of Unique Energy/Matter Interactions at the Earth's Surface which
generate reflected and/or emitted signals that are selective with respect to
wavelength and unique to each object or earth surface feature type.

Check Your Progress


1. What is a Super Sensor?
A super sensor is highly sensitive to all wavelengths. It would be simple,
reliable, accurate, economical, and would require no power or space. This
sensor yields data on the absolute brightness (or radiance) of a scene as
a function of wavelength.
2. What is a Real-Time Data Handling System?
It generates the instantaneous radiance-versus-wavelength response and
processes it into an interpretable format in real time. The data derived are
unique to a particular terrain and hence provide insight into its physical,
chemical and biological state.

Suggested Readings
1. Campbell, J., 1989. Introduction to Remote Sensing, Guilford, New
York.
2. Lillesand, T. M. and Kiefer, R. W., 1994. Remote Sensing and Image
Interpretation, John Wiley & Sons, New York.
3. Sarkar, A., 2015. Practical Geography: A Systematic Approach, Orient
Black Swan Private Ltd., New Delhi.
4. Anji Reddy, M., 2008. Textbook of Remote Sensing and Geographic
Information System, B.S. Publication, Hyderabad.
5. Chauniyal, D.D., 2010. Sudur Samvedan evam Bhogolik Suchana
Pranali (Hindi), Sharda Pustak Bhawan, Allahabad.

Web Sources
http://gsp.humboldt.edu/olm/Courses/GSP_216/lessons/thermal/sensors.h
tml

BLOCK 4

Fundamentals of Aerial Remote Sensing


Unit: 10 Aerial Photo Imaging System and Types of Aerial
Photographs

Unit: 11 Marginal Information of Aerial Photographs

Unit: 12 Elements of Photo Interpretation

Unit 10
Aerial Photo Imaging System and Types of
Aerial Photographs
Structure
Overview
Learning Objectives
10.1 Aerial Photo Imaging System
10.2 Geometry of an Aerial Photograph
10.3 Scales of Aerial Photograph

10.4 Types of Aerial Photographs


10.5 Vertical Vs Oblique Photographs
Let us Sum Up
Glossary
Check Your Progress
Suggested Readings
Web Sources

Overview
The geometry of an aerial photograph is based on the simple, fundamental
condition of collinearity. Three or more points that lie on the same line are
said to be collinear. In photogrammetry, a single ray of light is the straight
line; three fundamental points must always fall on this straight line: the
imaged point on the ground, the focal point of the camera lens, and the
image of the point on the film or imaging array of a digital camera. The length
of each ray, from the focal point of the camera to the imaged point on the
ground, is determined by the height of the camera lens above the ground
and the elevation of that point on the ground.
Learning Objectives
After Learning this lesson, you will be able to:
• Measure Photo Coordinates and Relate them to Ground
Coordinates
• Know about Scales of Aerial Photograph

• Understand Aerial Photographs and their Types.
• Acquire the Knowledge and Compare Between the Types of Aerial
Photograph

10.1 Aerial Photo Imaging System


This is the most common form of remote sensing data. The first commercial
companies to carry out aerial surveys came into existence in the early
1920s. Since then, many experiments have been made to improve surveying
by using different image formats, cameras, lenses and films.
A multispectral camera is mounted in the aircraft in conjunction with a
survey camera.
Generally, monochrome film is used because it is simple to process and
does not need extremely rigorous laboratory conditions.
This film is available in a number of types and speeds, which allows
adequate photography to be taken over a wide range of weather and
lighting conditions.
The exposed film must be sent to a fully equipped laboratory, so the
photographic cover cannot be checked within hours of being taken.

10.2 Geometry of an Aerial Photograph


To understand the geometry of an aerial photograph, it is important to
appreciate the orientation of the photograph with respect to the ground, i.e.
the way the rays connect or ‘project’ onto the ground in relation to the ground
representation (photograph or map). The following three examples of such
projection would be useful in understanding the problem.
Parallel Projection: In this projection, the projecting rays are parallel but
not necessarily perpendicular. The triangle ABC is projected on LL1 as
triangle abc (Fig. 10.1).

Fig. 10.1 Parallel Projection

Orthogonal Projection: This is a special case of parallel projections. Maps
are orthogonal projections of the ground. The advantage of this projection
is that the distances, angles or areas on the plane are independent of the
elevation differences of the objects. This is an example of orthogonal
projection where the projecting rays are perpendicular to the line LL1 (Fig.
10.2)

Fig. 10.2 Orthogonal projection

Central Projection: The projecting rays Aa, Bb and Cc pass through a
common point O, which is called the perspective centre. The image
projected by a lens is treated as a central projection. For a truly vertical
photograph of perfectly flat terrain, the aerial photograph would be
geometrically the same as the corresponding map of the area. However,
because of the tilt of the photograph and relief variations of the ground
photographed, an aerial photograph differs geometrically from the map of
the corresponding area (Fig. 10.3).

Fig. 10.3 Central Projection


Geometry of Vertical Photograph: It needs to be understood here that SP,
i.e., the perpendicular distance between the camera lens and the negative
plane is known as the focal length. On the other hand, SPG, i.e., the
perpendicular distance between the camera lens and the ground
photographed is known as the flying height.

Fig. 10.4 Geometry of Vertical Photograph

10.3 Scales of Aerial Photograph


Scale may be expressed in three ways:
• Unit Equivalent
• Representative Fraction
• Ratio
A photographic scale in which 1 millimetre on the photograph represents
25 metres on the ground would be expressed as follows:
• Unit Equivalent - 1 mm = 25 m
• Representative Fraction - 1/25 000
• Ratio - 1:25 000
Types of Aerial Photographs Based on Scale:
Aerial photographs may also be classified based on the scale of photograph
into three types.

Large Scale Photographs: When the scale of an aerial photograph is
1:15,000 or larger, the photograph is classified as a large-scale
photograph. Such photographs cover small areas in greater detail. A large
scale simply means that ground features appear at a larger, more detailed
size; the area of ground coverage seen in the photo is less than at smaller
scales.

Medium Scale Photographs: Aerial photographs with a scale ranging
between 1:15,000 and 1:30,000 are usually treated as medium scale
photographs.
Small Scale Photographs: Photographs with a scale smaller than 1:30,000
are referred to as small-scale photographs. They cover large areas in less
detail. A small scale simply means that ground features appear at a
smaller, less detailed size; the area of ground coverage seen in the photo
is greater than at larger scales.
Scale: the ratio of the distance between two points on a photo to the actual
distance between the same two points on the ground (i.e. 1 unit on the photo
equals "x" units on the ground). If a 1 km stretch of highway covers 4 cm on
an air photo, the scale is calculated as follows:
Photo distance / Ground distance = 4 cm / 1 km = 4 cm / 100 000 cm = 1/25 000
So the scale is 1:25 000.
The second method used to determine the scale of a photo is to find the
ratio between the camera's focal length and the plane's altitude above the
ground being photographed.

If a camera's focal length is 152 mm, and the plane's altitude Above Ground
Level (AGL) is 7 600 m, using the same method as above, the scale would
be: Focal length / Altitude = 152 mm / 7 600 m = 152 mm / 7 600 000 mm =
1/50 000. So the scale is 1:50 000.
A third method: the scale of the photograph can also be calculated if we
know the focal length of the camera and the height of the aircraft above the
ground:

Scale = f / (H - h)

where H = flying height of the aircraft above sea level, h = height of the
ground above sea level, and f is the focal length.

For example, with f = 152 mm and a height above the terrain of
H - h = 7 600 m, the scale is 152 mm / 7 600 000 mm = 1:50 000.
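The scale calculations above can be collected into a short Python sketch; the function names are our own, and the numbers reproduce the worked examples:

```python
def scale_denominator_from_distances(photo_cm, ground_cm):
    """Method 1: ground distance / photo distance gives the scale denominator."""
    return ground_cm / photo_cm

def scale_denominator_from_height(focal_mm, flying_height_m, ground_elev_m=0.0):
    """Methods 2 and 3: denominator of f / (H - h), with f in mm and heights in m."""
    return (flying_height_m - ground_elev_m) * 1000.0 / focal_mm

# 4 cm on the photo covering 1 km (100 000 cm) on the ground
print(scale_denominator_from_distances(4, 100_000))   # 25000.0, i.e. 1:25 000
# focal length 152 mm, flying 7 600 m above the terrain
print(scale_denominator_from_height(152, 7600))       # 50000.0, i.e. 1:50 000
```

Note that both distances must be converted to the same unit before dividing; that conversion (km to cm, m to mm) is where most manual scale calculations go wrong.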

10.4 Types of Aerial Photographs


Types of Aerial Photographs based on the Position of the Camera Axis:
Based on the position of the camera axis, aerial photographs are classified
into the following types:

Vertical Photographs:
While taking aerial photographs, two distinct axes are formed from the
camera lens centre, one towards the ground plane and the other towards
the photo plane. The perpendicular dropped from the camera lens centre to
the ground plane is termed as the vertical axis, whereas the plumb line
drawn from the lens centre to the photo plane is known as the
photographic/optical axis. When the photo plane is kept parallel to the
ground plane, the two axes also coincide with each other. The photograph
so obtained is known as vertical aerial photograph. However, it is normally
very difficult to achieve perfect parallelism between the two planes because
the aircraft flies over the curved surface of the earth. The photographic axis,
therefore, deviates from the vertical axis. If such a deviation is within the
range of plus or minus 3°, near-vertical aerial photographs are obtained.
Any photograph with an unintentional deviation of more than 3° of the
optical axis from the vertical axis is known as a tilted photograph (Fig. 10.5).

Fig. 10.5 Vertical Aerial photograph
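The tilt-based classification described in this section can be summarised as a small decision function. The thresholds (3° for vertical, 30° separating low from high oblique) come from the text; the function name and the intentional/unintentional flag are our own framing of the rules:

```python
def classify_aerial_photo(tilt_deg, intentional):
    """Classify a photograph by the deviation of the optical axis from vertical."""
    if tilt_deg <= 3:
        return "vertical / near-vertical"
    if not intentional:
        return "tilted"                       # unintentional deviation over 3 degrees
    return "low oblique" if tilt_deg <= 30 else "high oblique"

print(classify_aerial_photo(2, False))    # vertical / near-vertical
print(classify_aerial_photo(8, False))    # tilted
print(classify_aerial_photo(20, True))    # low oblique
print(classify_aerial_photo(60, True))    # high oblique
```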
Low Oblique: An aerial photograph taken with an intentional deviation of 15°
to 30° in the camera axis from the vertical axis is referred to as the low
oblique photograph. This kind of photograph is often used in
reconnaissance surveys (Fig. 10.6).

Fig. 10.6 Low-Oblique Photograph

High Oblique: High oblique photographs are obtained when the camera
axis is intentionally inclined by about 60° from the vertical axis. Such
photography is useful in reconnaissance surveys (Fig. 10.7).

Fig. 10.7 High Oblique

10.5 Comparison between Vertical and Oblique Photographs

Optical Axis
• Vertical: tilt less than 3°, i.e. exactly or nearly coincides with the vertical axis.
• Low Oblique: deviates by less than 30° from the vertical axis.
• High Oblique: deviates by more than 30° from the vertical axis.

Horizon
• Vertical: does not appear.
• Low Oblique: does not appear.
• High Oblique: appears.

Coverage
• Vertical: small area.
• Low Oblique: relatively larger area.
• High Oblique: largest area.

Shape of the Area Photographed
• Vertical: square.
• Low Oblique: trapezoidal.
• High Oblique: trapezoidal.

Scale
• Vertical: uniform, if the terrain is flat.
• Low Oblique: decreases from the foreground to the background.
• High Oblique: decreases from the foreground to the background.

Difference in Comparison to the Map
• Vertical: least.
• Low Oblique: relatively greater.
• High Oblique: greatest.

Advantages
• Vertical: useful in topographical and thematic mapping.
• Low Oblique: useful in reconnaissance surveys.
• High Oblique: illustrative.

Let us Sum Up
A photographic image is a central perspective. This implies that every light
ray that reaches the film surface during exposure passed through the
camera lens. It also implies that the positions of all imaged points are
controlled by a single point, the perspective centre, which governs the
geometry of the entire photograph. The principal point (PP) is the point on
the image where the optical axis intersects the image plane. The optical
axis is an imaginary line that passes through the optical centre of the lens,
perpendicular to the film or image plane. The distance between the
perspective centre and the principal point is the focal length.

Glossary
Vertical Photographs: photographs taken with the optical axis of the
camera coinciding, or nearly coinciding, with the vertical axis dropped from
the camera lens centre to the ground plane.
Low Oblique: An aerial photograph taken with an intentional deviation of 15°
to 30° in the camera axis from the vertical axis is referred to as the low
oblique photograph. This kind of photograph is often used in
reconnaissance surveys.
High Oblique: photographs obtained when the camera axis is intentionally
inclined by about 60° from the vertical axis. Such photography is useful in
reconnaissance surveys.

Check Your Progress


1. What are the types of aerial photographs based on scale?
• Large Scale Photographs
• Medium Scale Photographs
• Small Scale Photographs
2. What are the types of Aerial Photograph?

• Vertical photographs
• Low oblique photographs
• High oblique photographs

Suggested Readings
1. Campbell, J., 1989. Introduction to Remote Sensing, Guilford, New
York.
2. Lillesand, T. M. and Kiefer, R. W., 1994. Remote Sensing and Image
Interpretation, John Wiley & Sons, New York.
3. Sarkar, A., 2015. Practical Geography: A Systematic Approach, Orient
Black Swan Private Ltd., New Delhi.
4. Anji Reddy, M., 2008. Textbook of Remote Sensing and Geographic
Information System, B.S. Publication, Hyderabad.
5. Chauniyal, D.D., 2010. Sudur Samvedan evam Bhogolik Suchana
Pranali (Hindi), Sharda Pustak Bhawan, Allahabad.

Web Sources
1. http://gsp.humboldt.edu/olm/Courses/GSP_216/lessons/thermal/senso
rs.html

Unit 11
Marginal Information of Aerial Photographs
Structure
Overview
Learning Objectives
11.1 Marginal Information of Aerial Photographs

11.1.1 Fiducial Marks
11.1.2 Task Number
11.1.3 Agency Number
11.1.4 Photographic Number
11.1.5 Strip Number
11.1.6 Principal Point
11.1.7 Scale
11.1.8 Transfer Point
11.1.9 Focal Length
11.1.10 Altimeter
Let us Sum Up
Glossary
Check Your Progress
Suggested Readings
Web Sources

Overview
The word 'photography' means 'writing with light'. Aerial photography
means taking pictures of the earth from the air. A photograph is the
signature of energy emitted or reflected by an object, recorded on
photographic film. Photographic interpretation is "the act of examining
photographic images for the purpose of identifying objects and judging
their significance." The most important principle of photo interpretation is
observation; the second is the capacity to use logical modes of thought to
draw correct conclusions from the things observed. Aerial photographs
provide a unique tool. They cover
a large area on earth’s surface. Overlapping pairs of photographs provide a

101 | P a g e
three-dimensional view of the object photographed. Images on aerial
photographed are permanent and unbiased representation of objects
occurring on earth surface. The large area photographed enables a photo
interpreter to perceive relations between objects and their background.
Learning Objectives
After Learning this lesson, you will be able to:
• Know about Marginal Information of Aerial Photograph
• Understand Aerial Photographs

11.1 Marginal Information of Aerial Photographs


Aerial photographs contain a detailed record of features on the ground at
the time of exposure. An interpretation is made as to the physical nature of
the objects and phenomena appearing in the photographs. Interpretation
may take place at a number of levels of complexity, from the simple
recognition of objects on the earth's surface to the derivation of detailed
information regarding the complex interactions among earth surface and
subsurface features.

Fig. 11.1 Marginal Information given on Vertical Aerial Photographs

(Credit: NCERT)
A: Fiducial Marks B: Photo Specifications
C: Tilt Indicator D: Flying Height Indicator
793 is a photo specification number maintained by the 73 APFPS Party of
the Survey of India. B is the flying agency that carried out the photography
(in India three flying agencies are officially permitted to carry out aerial
photography: the Indian Air Force, the Air Survey Company, Kolkata, and
the National Remote Sensing Agency, Hyderabad, identified on aerial
photographs as A, B and C respectively). 5 is the strip number and 23 is
the photo number in strip 5.
11.1.1 Fiducial Marks:
A fiducial marker or fiducial is an object placed in the field of view of
an imaging system that appears in the image produced, for use as a point
of reference or a measure. It may be either something placed into or on the
imaging subject, or a mark or set of marks in the reticle of an optical
instrument.

Fiducial marks are small registration marks exposed on the edges of a
photograph. The distances between fiducial marks are precisely measured
when a camera is calibrated. They are helpful in locating the principal
point. These marks are also called collimating marks.

11.1.2 Task Number


Every aerial photograph is given a task number by Survey of India (SOI).

11.1.3 Agency Number


In India, aerial photography is carried out by three agencies: 1. the Indian
Air Force, 2. M/S Air Survey Company, Dumdum (Kolkata), and 3. NRSC
(National Remote Sensing Centre). The Survey of India (SOI) has assigned
codes to these agencies: the Indian Air Force's code is 'A', M/S Air Survey
Company's code is 'B', and the code of NRSC is 'C'.

11.1.4 Photographic Number:


Aerial photographs are numbered serially along the strip or flight direction.
When strips are flown East-West, photographs are numbered from West
towards East. When strips are flown North-South, they are numbered from
South towards North.

11.1.5 Strip Number:


A photographic task includes a number of strips or runs. If the strips are
flown East-West, they are numbered successively from North to South; if
they are flown North-South, they are numbered consecutively from West
to East, starting with strip number 1.

11.1.6 Principal Point:


The point of intersection of the lines joining opposite fiducial marks on a
photograph is called the principal point. The principal point marks the
position on the photograph vertically below the camera at the instant of
exposure.
11.1.7 Scale:
Scale is correct at the principal point. Away from the principal point, the
scale becomes increasingly distorted.

11.1.8 Transfer Point:
The transfer point (also called the conjugate principal point) is the principal
point of one photograph transferred onto the adjacent overlapping
photograph. It rarely falls on a convenient object or building, so it is
located with the help of the fiducial marks and stereoscopic (3D) viewing
of the overlapping pair. The transfer point does not lie at the centre of the
photograph but towards its side, within the overlap area.
11.1.9 Focal Length:
Focal length is the distance between the negative plane and the optical
centre of the lens. It is shown on the aerial photograph to the right of the
clock recorded in the margin, generally in millimetres. As focal length
increases, image distortion decreases. The focal length is precisely
measured when the camera is calibrated and is used in calculating the
scale.
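The role of focal length in scale calculation can be sketched as a short computation. (This is an illustrative sketch, not part of the original text: the standard relation is scale S = f / (H - h), where f is the focal length, H the flying height above mean sea level, and h the average terrain elevation; the example figures below are hypothetical.)

```python
# Photo scale S = f / (H - h): focal length divided by the flying height
# above the terrain (all values in the same units, here metres).
def photo_scale(focal_length, flying_height, terrain_height=0.0):
    return focal_length / (flying_height - terrain_height)

# Hypothetical example: a 152 mm camera flown at 3,800 m above mean sea
# level over terrain averaging 760 m elevation.
s = photo_scale(0.152, 3800, 760)
print(f"Scale = 1:{round(1 / s)}")   # Scale = 1:20000
```

This is why both the focal length and the altimeter reading are recorded in the photo margin: together with the average terrain height they fix the photo scale.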
11.1.10 Altimeter:
The altimeter reading records the flying height of the aircraft above mean
sea level at the instant of exposure; the average terrain height of the area
must also be known for scale calculations. It is shown on the aerial
photograph to the left of the clock.

Let us Sum Up
Aerial photography or airborne imagery is the taking of photographs from
an aircraft or other flying object. Aerial photography is used in cartography
(particularly in photogrammetric surveys, which are often the basis for
topographic maps), land-use planning, archaeology, movie production,
environmental studies, power line inspection, surveillance, commercial
advertising, conveyancing, and artistic projects.

Glossary
Fiducial Marks: Fiducial marks are small registration marks exposed on the
edges of a photograph.
Photographic Number: Aerial Photographs are numbered serially along the
strip or flight direction.
Altimeter: The altimeter reading records the flying height of the aircraft
above mean sea level. It is shown on the aerial photograph to the left of
the clock.

Check Your Progress


1. What is the marginal information on an aerial photograph?
Fiducial Marks, Task Number, Agency Number, Photographic Number, Strip
Number, Principal Point, Scale, Transfer Point, Focal Length, Altimeter

Suggested Readings
1. Campbell, J., 1989. Introduction to Remote Sensing, Guilford, New
York.
2. Lillesand, T.M. and Kiefer, R.W., 1994. Remote Sensing and Image
Interpretation, John Wiley & Sons, New York.
3. Sarkar, A., 2015. Practical Geography: A Systematic Approach.
Orient Black Swan Private Ltd., New Delhi.
4. Anji Reddy, M., 2008. Textbook of Remote Sensing and Geographic
Information System, B.S. Publications, Hyderabad.
5. Chauniyal, D.D., 2010. Sudur Samvedan evam Bhogolik Suchana
Pranali (Hindi), Sharda Pustak Bhawan, Allahabad.

Web Sources
1. https://www.ijsr.net/archive/v3i9/U0VQMTQ1MDM=.pdf

Unit 12
Elements of Photo Interpretation
Structure
Overview
Learning Objectives
12.1 Photo Interpretation

12.2 Photo-Interpretation Keys


12.2.1 Photographic Tone
12.2.2 Photographic Texture
12.2.3 Shape of Object
12.2.4 Size of Object
12.2.5 Pattern
12.2.6 Site
12.2.7 Shadow
12.2.8 Association
Let us Sum Up
Glossary
Check Your Progress
Suggested Readings
Web Sources

Overview
The analysis of remote sensing imagery involves the identification of
various targets in an image; those targets may be environmental or
artificial features consisting of points, lines, or areas. Targets reflect or
emit radiation. This radiation is measured and recorded by a sensor, and
finally depicted as an image such as an air photo or a satellite image. An
image interpreter examines aerial or satellite imagery for the purpose of
making use of it, identifying the features, and evaluating their significance.
Image interpretation strategies have developed over more than 100 years,
initially focusing on military applications and later extending to a wide
range of scientific and commercial uses. The process of extracting
qualitative and quantitative information about objects from aerial
photographs or satellite images is known as interpretation. Aerial photo
interpretation is the process of interpreting aerial photographs; it relies on
the abilities of a human analyst, known as a photo interpreter.
Learning Objectives
After Learning this lesson, you will be able to:
• Know about the elements of photo interpretation
• Understand aerial photographic interpretation
• Describe the basic characteristics of interpretation

12.1 Photo Interpretation


During the process of interpretation, aerial photo interpreters usually carry
out the following tasks:
• Detection: Selectively picking out objects
• Recognition and identification: Naming objects or areas
• Analysis and deduction: Detecting the spatial order of the objects
and predicting the occurrence of certain relationships
• Classification: Arranging the objects and elements identified into
an orderly system
• Accuracy determination: Field checks to confirm the interpretation
Kinds and amounts of information that could be obtained from aerial
photographs depend primarily on
• Type of terrain
• Climatic environment
• Stage of the geomorphic cycle.

12.2 Photo-Interpretation Keys


• Photographic Tone
• Photographic Texture
• Shape of the objects
• Size of the objects
• Pattern
• Site
• Shadow
• Association
The scale of the photographs and vertical exaggeration also influence
interpretation.
12.2.1 Photographic Tone
Tone is a measure of the relative amount of light reflected by an object
and recorded on the photograph. It refers to the relative brightness or
colour of objects in an image. Photographic tone is influenced by the
reflectivity of the object, the angle of the reflected light, geographic
latitude, the type of photography and film sensitivity, the light transmission
of filters, and photographic processing.

12.2.2 Photographic Texture
Texture signifies the frequency of change and arrangement of tones in a
photographic image. It is produced by an aggregation of unit features and
determines the overall visual "smoothness" or "coarseness" of an image.
Texture can distinguish two objects with the same tone. Texture also
depends on the scale of the aerial photograph: as the scale is reduced,
the texture progressively becomes finer and ultimately disappears.
• Coarse texture: clustered objects with rounded crowns
• Medium texture: scattered objects
• Fine texture: thick object density with small crowns
• Smooth texture: regular dispersion of uniform objects
• Rough texture: irregularly dispersed objects
• Rippled texture: develops due to water waves over a shallow
water surface
• Mottled texture: pitted outwash plains
• Granular texture: poor and loosely scattered objects

12.2.3 Shape of object


A qualitative statement referring to the general form, configuration, or
outline of an object. Certain geomorphic features can be identified directly
from their shape, provided they are not much eroded, e.g. folds, linear
intrusives, and massive intrusives.
12.2.4 Size of object
The size of an object is a function of photo scale and is considered in
combination with the shape of the object. Sizes can be estimated by
comparing objects with other objects whose sizes are known, e.g. a small
storage shed vs a mine pit.
12.2.5 Pattern

Patterns are the spatial arrangement of objects and can indicate a genetic
relation: the orderly repetition of aggregate features in certain geometric or
planimetric arrangements, e.g. fold patterns, drainage patterns, outcrop
and lithological patterns.
A related (miscellaneous) key is association: the occurrence of certain
features in relation to others, e.g. rivers and drainage, buildings and
roads, open-cast mines and trenches.

12.2.6 Site
A statement of an object's position in relation to others in its vicinity, which
usually aids in its identification, e.g. certain vegetation or tree species are
expected to occur on well-drained uplands or in particular localities.
12.2.7 Shadow

Shadows of objects aid in their identification. The shape or outline of a
shadow affords an impression of the profile view of the object, which aids
in interpretation.
12.2.8 Association

Association takes into account the relationship between other
recognizable objects or features in proximity to the target of interest. The
identification of features that one would expect to associate with other
features may provide information to facilitate identification. For example,
commercial properties may be associated with proximity to major
transportation routes, whereas residential areas would be associated with
schools, playgrounds, and sports fields; a lake is associated with boats, a
marina, and adjacent recreational land.

Let us Sum Up
The process of analysing and extracting valuable information from aerial
photographs is known as aerial photo interpretation.
The assignment of objects, characteristics, or locations to classes based on
their appearance on imagery is known as classification.
Enumeration is the process of listing or counting discrete objects seen on
an image.

The features of image aspects of aerial photography, such as tone, size,
shape, texture, pattern, shadow, site, and association are used to extract
information from aerial photography.
The association is the relationship between other recognised items or
features near the target of interest.

Glossary
Image elements of aerial photography, such as tone, size, shape, texture,
pattern, shadow, site, and association, are used to extract information
from aerial photographs.
The detection and measurement of light waves in the optical region of the
electromagnetic spectrum is known as radiometry.
Photometry is the study of measuring light, which is perceived by the human
eye in terms of brightness.

Check Your Progress


1. What is interpretation?
The process of extracting qualitative and quantitative information about
objects from aerial photographs or satellite images is known as
interpretation.
2. How does an aerial photograph vary from a regular photograph?
• Overhead perspective
• May record outside the visible light spectrum
• Scales and orientations that are unfamiliar
3. Define hue.
Tone is also known as hue or colour, which refers to the relative
brightness of the elements on an aerial photograph.
4. Define pattern with an example.
The spatial arrangement of objects is known as pattern. The
recognizable pattern is formed by orderly repetition of similar tones and
textures. Urban streets with regularly spaced houses are the best
example of pattern.

Suggested Readings
1. Campbell, J., 1989. Introduction to Remote Sensing, Guilford, New
York.
2. Lillesand, T.M. and Kiefer, R.W., 1994. Remote Sensing and Image
Interpretation, John Wiley & Sons, New York.
3. Lueder, D.R., 1959. Aerial Photographic Interpretation: Principles
and Applications, McGraw Hill, New York.
Web Sources
1. https://natural-resources.canada.ca/
2. https://en.wikipedia.org/
3. https://ibis.geog.ubc.ca/courses/geob373/lectures/Handouts/lecture05.pdf
4. https://www.nrcan.gc.ca/maps-tools-publications/satellite-imagery-air-
photos/air-photos/national-air-photo-library/about-aerial-

BLOCK 5
Fundamentals of Satellite Remote Sensing
Unit 13: Types of Satellites: Geostationary and Sun-synchronous Satellites
Unit 14: Resolution: Spatial, Spectral, Radiometric and Temporal
Unit 15: Visual Image Interpretation
Unit 16: Digital Image Classification

Unit 13
Types of Satellites: Geostationary and Sun-
synchronous Satellites
Structure
Overview
Learning Objectives

13.1 Types of Satellites
13.1.1 Geostationary Satellites
13.1.2 LEO (Low Earth Orbit)
13.1.3 MEO (Medium Earth Orbit)
13.1.4 HEO (High Earth Orbit)
13.1.5 Sun-synchronous Satellites
13.2 Types of Satellites Based on Their Applications
13.2.1 Communication Satellites
13.2.2 Earth Observation Satellites
13.2.3 Navigation Satellites
13.2.4 Astronomical Satellites
Let us Sum Up
Glossary
Check Your Progress
Suggested Readings
Web Sources

Overview
A satellite or artificial satellite is an object intentionally placed into orbit in
outer space. Except for passive satellites, most satellites have an
electricity generation system for equipment on board, such as solar
panels or radioisotope thermoelectric generators (RTGs). Most satellites
also have a method of communication with ground stations, using
transponders. Many satellites use a standardized bus to save cost and
work, the most popular of which is the small CubeSat. Similar satellites
can work together as a group, forming constellations. Because of the high
cost of launching to space, satellites are designed to be as lightweight and
robust as possible. Satellites are placed from the surface into orbit by
launch vehicles, high enough to avoid orbital decay caused by the
atmosphere. Satellites can then change or maintain their orbit by
propulsion, usually by chemical or ion thrusters. As of 2018, about 90% of
satellites orbiting Earth were in low Earth orbit or geostationary orbit;
geostationary satellites appear to stay still in the sky. Some imaging
satellites choose a Sun-synchronous orbit because they can scan the
entire globe under similar lighting. As the number of satellites and pieces
of space debris around Earth increases, the collision threat is becoming
more severe. A small number of satellites orbit other bodies (such as the
Moon, Mars, and the Sun) or many bodies at once (two for a halo orbit,
three for a Lissajous orbit).
Learning Objectives
After Learning this lesson, you will be able to:
• Know about Types of Satellites
• Geostationary and Sun-synchronous Satellites

13.1 Types of Satellites


Satellites have been put into space for various purposes, and their
placement in space and orbital shapes have been determined by their
specific requirements. The following types of satellite orbits have been
identified:

Fig. 13.1 Satellite types

• GEO (Geostationary Earth Orbit) at about 36,000 km above the
earth's surface.
• LEO (Low Earth Orbit) at about 500-1,500 km above the earth's
surface.
• MEO (Medium Earth Orbit) or ICO (Intermediate Circular Orbit) at
about 6,000-20,000 km above the earth's surface.
• HEO (Highly Elliptical Orbit).
• Sun-synchronous orbit (SSO) satellites.
13.1.1 Geostationary Satellites
A geostationary orbit, also referred to as a geosynchronous equatorial orbit
(GEO), is a circular geosynchronous orbit 35,786 km (22,236 mi) in altitude
above Earth's Equator (42,164 km (26,199 mi) in radius from Earth's
centre), following the direction of Earth's rotation. An object in such an orbit
has an orbital period equal to Earth's rotational period, one sidereal day,
and so to ground observers it appears motionless, in a fixed position in the
sky. The concept of a geostationary orbit was popularised by the science
fiction writer Arthur C. Clarke in the 1940s to revolutionise
telecommunications, and the first satellite to be placed in this kind of orbit
was launched in 1963.
Communication satellites are often placed in a geostationary orbit so that
Earth-based satellite antennas do not have to rotate to track them but can
be pointed permanently at the position in the sky where the satellites are
located. Weather satellites are also placed in this orbit for real-time
monitoring and data collection, and navigation satellites to provide a
known calibration point and enhance GPS accuracy. Geostationary
satellites are launched via a temporary orbit and placed in a slot above a
particular point on the Earth's surface. The orbit requires some station
keeping to hold its position, and modern retired satellites are placed in a
higher graveyard orbit to avoid collisions.
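The altitude quoted above follows directly from Kepler's third law: a circular orbit whose period equals one sidereal day must have a radius of about 42,164 km. A quick sanity check (an illustrative sketch using standard textbook constants, not part of the original text):

```python
import math

GM = 3.986004418e14     # Earth's gravitational parameter, m^3/s^2
T = 86164.0905          # one sidereal day, s
RE = 6378.137e3         # Earth's equatorial radius, m

# Kepler's third law: T^2 = 4*pi^2 * r^3 / GM  =>  r = (GM*T^2 / (4*pi^2))^(1/3)
r = (GM * T**2 / (4 * math.pi**2)) ** (1 / 3)
altitude_km = (r - RE) / 1000

print(round(r / 1000))      # 42164 km orbital radius
print(round(altitude_km))   # 35786 km above the equator
```

The same relation explains why GEO is unique: only one radius yields a 24-hour (sidereal) period, so all geostationary satellites share a single ring above the equator.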
Advantages of GEO satellite
• Three geostationary satellites are enough for complete coverage of
almost any spot on earth.
• Receivers and senders can use fixed antenna positions, no
adjusting is needed.
• GEOs are ideal for TV and radio broadcasting.
• Lifetime expectations for GEOs are rather high, at about 15 years.

• Geostationary satellites have a 24-hour view of a particular area.
• GEOs typically do not need handover due to the large footprints.
• GEOs don't exhibit any Doppler shift because the relative movement
is zero.

Fig. 13.2 Geostationary Satellites


Disadvantages of GEO satellite
• Northern or southern regions of the earth have more problems
receiving these satellites due to the low elevation angle above a
latitude of 60 degrees, i.e. larger antennas are needed in this case.

• Shading of the signals in cities due to high buildings and the low
elevation further away from the equator limits transmission quality.
• The transmit power needed is relatively high (about 10 W) which
causes problems for battery powered devices.
• These satellites can't be used for small mobile phones.
• The biggest problem for voice and also data communication is the
high latency of over 0.25 s one way; retransmission schemes
known from fixed networks fail.
• Transferring a GEO into orbit is very expensive.
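The quarter-second latency figure above is simple geometry: a signal relayed through GEO travels at least twice the orbital altitude at the speed of light. A rough sketch (an illustration added here, not part of the original text; real paths from off-nadir ground stations are longer):

```python
C = 299_792_458        # speed of light, m/s
GEO_ALT = 35_786_000   # GEO altitude above the equator, m

# One hop (ground -> satellite -> ground) covers at least 2 x the altitude.
one_way_s = 2 * GEO_ALT / C
print(f"minimum one-way latency: {one_way_s * 1000:.0f} ms")  # ~239 ms
```

Adding processing delays and longer slant ranges brings the practical figure to the "over 0.25 s" quoted in the text, which is why interactive voice over GEO feels sluggish.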

13.1.2 LEO (Low Earth Orbit)
As LEOs circulate in a lower orbit, they exhibit a much shorter period
(typically 95 to 120 minutes). Additionally, LEO systems try to ensure a
high elevation for every spot on earth to provide a high-quality
communication link. Each LEO satellite will only be visible from the earth
for about ten minutes.
A further classification of LEOs into little LEOs with low-bandwidth
services (some 100 bit/s), big LEOs (some 1,000 bit/s), and broadband
LEOs with plans reaching into the Mbit/s range can be found in
Comparetto (1997).
LEO satellites are much closer to earth than GEO satellites, ranging from
500 to 1,500 km above the surface. LEO satellites do not stay in a fixed
position relative to the surface and are only visible for 15 to 20 minutes
each pass.
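The 95-120 minute periods quoted above follow from Kepler's third law applied to the 500-1,500 km altitude band. A sketch of the calculation (an illustration using standard Earth constants, not part of the original text):

```python
import math

GM = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
RE = 6378.137e3       # Earth's equatorial radius, m

def period_minutes(altitude_m):
    """Circular-orbit period from Kepler's third law: T = 2*pi*sqrt(a^3/GM)."""
    a = RE + altitude_m
    return 2 * math.pi * math.sqrt(a**3 / GM) / 60

print(round(period_minutes(500e3)))    # ~95 min at 500 km
print(round(period_minutes(1500e3)))   # ~116 min at 1,500 km
```

The short period is also why each satellite is visible for only minutes at a time: the ground track sweeps past any fixed observer quickly.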
Advantages of LEO satellite
• Using advanced compression schemes, transmission rates of about
2,400 bit/s can be enough for voice communication.
• LEOs even provide this bandwidth for mobile terminals with omni-
directional antennas using low transmit power in the range of 1 W.
• A LEO satellite's smaller area of coverage is less of a waste of
bandwidth.
• A LEO satellite's proximity to earth compared to a geostationary
satellite gives it a better signal strength and less time delay, which
makes it better for point-to-point communication.
• Smaller footprints of LEOs allow for better frequency reuse, like the
concepts used for cellular networks.
Disadvantages of LEO satellite
• The biggest problem of the LEO concept is the need for many
satellites if global coverage is to be reached.
• The high number of satellites combined with the fast movement
results in a high complexity of the whole satellite system.
• The short time of visibility with a high elevation requires an additional
mechanism for connection handover between different satellites.
• One general problem of LEOs is the short lifetime of about five to
eight years due to atmospheric drag and radiation from the inner
Van Allen belt.
• The low latency via a single LEO is only half of the story.
• Other factors are the need for routing of data packets from satellite
to satellite (or several times from base stations to satellites and
back) if a user wants to communicate around the world.

• A GEO typically does not need this type of routing, as senders and
receivers are most likely in the same footprints.
13.1.3 MEO (Medium Earth Orbit)
• A MEO satellite is situated in orbit somewhere between 6,000 km
and 20,000 km above the earth's surface.
• MEO satellites are similar to LEO satellites in functionality.
• Medium earth orbit satellites are visible for much longer periods of
time than LEO satellites, usually between 2 and 8 hours.
• MEO satellites have a larger coverage area than low earth orbit
satellites.
• MEOs can be positioned somewhere between LEOs and GEOs,
both in terms of their orbit and in terms of their advantages and
disadvantages.
Advantages of MEO
• Using orbits around 10,000 km, the system only requires about a
dozen satellites, which is more than a GEO system but much less
than a LEO system.
• These satellites move more slowly relative to the earth's rotation,
allowing a simpler system design (satellite periods are about six
hours).
• Depending on the inclination, a MEO can cover larger populations,
so requiring fewer handovers.
• A MEO satellite's longer duration of visibility and wider footprint
mean fewer satellites are needed in a MEO network than in a LEO
network.

Disadvantages of MEO
• Again, due to the larger distance to the earth, delay increases to
about 70-80 ms.

• The satellites need higher transmit power and special antennas for
smaller footprints.
• A MEO satellite's distance gives it a longer time delay and a weaker
signal than a LEO satellite.
13.1.4 HEO (High Earth Orbit)
• The High Earth orbit satellite is the only non-circular orbit of the four
types.
• HEO satellite operates with an elliptical orbit, with a maximum
altitude (apogee) like GEO, and a minimum altitude (perigee) like
LEO.
• The HEO satellites are used for special applications where coverage
of high latitude locations is required.
13.1.5 Sun-synchronous Satellites
A Sun-synchronous orbit (SSO), also called a heliosynchronous orbit, is a
nearly polar orbit around a planet, in which the satellite passes over any
given point of the planet's surface at the same local mean solar time. More
technically, it is an orbit arranged so that it precesses through one
complete revolution each year, so it always maintains the same
relationship with the Sun. A Sun-synchronous orbit is useful for imaging,
reconnaissance, and weather satellites, because every time the satellite is
overhead, the surface illumination angle on the planet underneath it is
nearly the same. This consistent lighting is a useful characteristic for
satellites that image the Earth's surface in visible or infrared wavelengths,
such as weather and spy satellites, and for other remote-sensing
satellites, such as those carrying ocean and atmospheric remote-sensing
instruments that require sunlight. For example, a satellite in
Sun-synchronous orbit might ascend across the equator twelve times a
day, each time at approximately 15:00 mean local time.
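The "precesses once per year" condition fixes the orbit's inclination once its altitude is chosen: Earth's equatorial bulge (the J2 term) makes the orbital plane drift, and for a slightly retrograde near-polar orbit this drift can be made to match the Sun's apparent motion of 360° per year. A sketch of the standard calculation (constants and formula are common textbook values, not taken from this unit):

```python
import math

GM = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
RE = 6378.137e3       # Earth's equatorial radius, m
J2 = 1.08263e-3       # Earth's oblateness (J2) coefficient

def sso_inclination_deg(altitude_m):
    """Inclination at which the J2 nodal precession of a circular orbit
    matches the Sun's apparent motion (one revolution per tropical year)."""
    a = RE + altitude_m                             # orbital radius
    n = math.sqrt(GM / a**3)                        # mean motion, rad/s
    sun_rate = 2 * math.pi / (365.2422 * 86400)     # rad/s
    cos_i = -sun_rate / (1.5 * n * J2 * (RE / a)**2)
    return math.degrees(math.acos(cos_i))

print(round(sso_inclination_deg(700e3), 1))   # ~98.2 deg for a 700 km orbit
```

The negative cosine means the orbit is retrograde (inclination just above 90°), which is why Sun-synchronous imaging satellites are described as "nearly polar".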

Fig. 13.3 Geostationary and Sun-synchronous Orbit
13.2 Types of Satellites Based on Their Applications
Providing communication and television services is only the tip of the
iceberg when it comes to the use of space-based technology. Many types
of satellites have been launched in recent years for a wide variety of
scientific purposes, including Earth observation, meteorological study,
navigation, studying the effects of space flight on living organisms, and
gaining insight into the cosmos. Today, the most common four types of
satellites based on their application are:
• communication
• Earth observation
• navigation
• astronomical
Our in-depth examination of the characteristics of different types of satellites
and their functions continues below.
13.2.1 Communication Satellites
A communication spacecraft, usually located at GEO and equipped with a
transponder — an integrated receiver and transmitter of radio signals —
may receive signals from Earth and retransmit them back to the planet. As
a result, it opens interaction channels between regions that were previously

unable to communicate with one another due to large distances or other
obstacles. Different types of communication satellites facilitate various
forms of media transmissions, such as radio, TV, telephone, and the
Internet.
Using the communication type of spacecraft, you can relay many signals at
once. Spacecraft for broadcasting and TV signal distribution to ground-
based stations typically have individual transponders for each carrier. In
most cases, though, several carriers will be relayed by a single transponder.
Due to its compatibility with mobile terminals, this type of satellite is ideally
suited for long-distance communication.
13.2.2 Earth Observation Satellites
The purpose of Earth observation satellites is to monitor our planet from
space and report back on any changes they observe. This type of space
technology makes possible consistent and repeatable environmental
monitoring as well as rapid analysis of events during emergencies such as
natural disasters and armed conflicts.
The goals of the surveillance mission determine the type of satellite sensors
used for Earth observation. Information collected varies depending on the
type of sensor employed and the available frequency bands.
Our first EOS SAT constellation satellite, EOS SAT-1, is now orbiting the
Earth on the mission to improve farming and forest management through
precision technology. Eleven spectral bands of the EOS SAT-1 are
specifically designed to monitor diverse agriculture and forestry aspects,
from the presence of crop diseases to soil moisture.

It is possible to categorize Earth observation spacecraft into the following
types:
• Weather satellites are employed for monitoring and forecasting
weather trends and providing actual weather data. GEO is ideal for
different types of weather satellites, as it provides a constant
viewpoint that enables scientists to track cloud patterns and predict
their movements.
• Remote sensing satellites’ primary applications are all types of
environmental monitoring and geographical mapping. Machines
used for different types of remote sensing circle the Earth in one of
three orbits: polar, non-polar LEO, or GEO. Geographical
information system (GIS) satellites are a type of remote sensing
spacecraft whose main function is to provide images appropriate
for GIS mapping and further spatial analysis.

13.2.3 Navigation Satellites
The navigation system constellations are located between 20,000 and
37,000 kilometers from Earth’s surface. This type of satellite sends out
signals that reveal their time, position in space, and health status. There are
two major types of space navigation systems:
• The spacecraft of the Global Navigation Satellite System
(GNSS) broadcast signals that GNSS receivers pick up and utilize
for geolocation purposes, providing global coverage. Galileo in
Europe, GPS in the United States, and the BeiDou Navigation
Satellite System in China are all examples of GNSS.
• The Regional Navigation Satellite System (RNSS) is an
autonomous regional navigation system that provides coverage on
a regional scale. For instance, India’s IRNSS project aims to provide
Indian citizens with a reliable location-based service.
13.2.4 Astronomical Satellites
Basically, an astronomical satellite is a giant telescope in orbit. It can see
well without interference from the Earth’s atmosphere, and its infrared
imaging technology can function normally without being fooled by the
planet’s surface temperature. The satellite type used for astronomy has a
vision that is up to ten times better than the most powerful telescope on
Earth.
Spacecraft used in astronomy can be broken down into several distinct
types:
• Astronomy satellites are used to investigate different types of
celestial bodies and phenomena in space, from the creation of stars
and planetary surface maps and taking images of the planets in our
solar system to the study of black holes.
• The use of climate research satellites fitted with specific types of
sensors allows scientists to gather comprehensive, multi-faceted
data on the world’s oceans and ice, land, biosphere, and
atmosphere.
• Space-based studies on plants and animal cells and structures are
possible thanks to biosatellites. Because they allow scientists from
different regions to work together, this type of spacecraft plays a
crucial role in the progress of medicine and biology.
Most satellites can perform more than one function simultaneously. Still, it’s
a common recommendation that researchers diversify the types of satellites

they use to obtain more comprehensive and accurate results of their
studies. EOSDA Land Viewer is a helpful tool for this because it aggregates
space-collected imagery (including high-resolution) from multiple sources
and provides a user-friendly interface for finding and downloading the
images you need.

Let us Sum Up
Satellites are carried from the surface into orbit by launch vehicles, high
enough to avoid orbital decay caused by atmospheric drag. Once in space,
satellites change or maintain their orbits using propulsion, usually chemical
or ion thrusters. As of 2018, about 90% of satellites orbiting Earth were in
low Earth orbit or geostationary orbit; a geostationary satellite appears
stationary in the sky. Some imaging satellites use a Sun-synchronous orbit,
which lets them scan the entire globe under similar lighting conditions. As
the number of satellites and the amount of space debris around Earth
increase, the threat of collision becomes more severe.

Glossary
Geostationary orbit: A geostationary orbit, also referred to as a
geosynchronous equatorial orbit (GEO), is a circular geosynchronous orbit
35,786 km (22,236 mi) in altitude above Earth's Equator (42,164 km (26,199
mi) in radius from Earth's center) and following the direction of Earth's
rotation.
Sun-synchronous orbit: A Sun-synchronous orbit (SSO), also called a
heliosynchronous orbit, is a nearly polar orbit around a planet, in which the
satellite passes over any given point of the planet's surface at the same
local mean solar time. More technically, it is an orbit arranged so that it
precesses through one complete revolution each year, so it always
maintains the same relationship with the Sun.

Check Your Progress


1. Define Geostationary Orbit
It is a circular geosynchronous orbit 35,786 km in altitude above
Earth's Equator (42,164 km in radius from Earth's center).
2. Define Sun-synchronous Orbit.
It is a nearly polar orbit around a planet in which the satellite passes
over any given point of the planet's surface at the same local mean solar
time.

Suggested Readings
1. Campbell, J., 1989. Introduction to Remote Sensing, Guilford, New
York.
2. Lillesand, T.M. and Kiefer, R.W., 1994. Remote Sensing and Image
Interpretation, John Wiley & Sons, New York.
3. Lueder, D.R., 1959. Aerial Photographic Interpretation: Principles and
Applications, McGraw-Hill, New York.
Web Sources
1. https://eos.com/blog/types-of-satellites/
2. https://www.javatpoint.com/types-of-satellite-systems

Unit 14
Resolution: Spatial, Spectral, Radiometric
and Temporal
Structure
Overview
Learning Objectives

14.1 Resolution
14.1.1 Spatial Resolution
14.1.2 Spectral Resolution
14.1.3 Radiometric Resolution
14.1.4 Temporal Resolution
14.1.5 Geometric Resolution
14.2 Factors Affecting Remote Sensing Resolutions
14.2.1 Platforms and Sensors
14.2.2 Environmental Conditions
14.2.3 Data Processing and Analysis
14.2.4 User Requirements and Applications
Let us Sum Up
Glossary
Check Your Progress
Suggested Readings
Web Sources

Overview
The resolution of satellite images varies depending on the instrument used
and the altitude of the satellite's orbit. For example, the Landsat archive
offers repeated imagery at 30-meter resolution for the planet, but most of it
has not been processed from the raw data. Landsat 7 has an average return
period of 16 days. For many smaller areas, images with resolution as fine
as 41 cm can be available. Satellite imagery is sometimes supplemented
with aerial photography, which has higher resolution, but is more expensive

per square meter. Satellite imagery can be combined with vector or raster
data in a GIS provided that the imagery has been spatially rectified so that
it will properly align with other data sets.
Learning Objectives
After Learning this lesson, you will be able to:
• Know about the resolution of satellite images.
• Know about the types of resolution.

14.1 Resolution
The term resolution is used in remote sensing to describe resolving
power, which includes the ability to identify not just the presence of two
objects but also their characteristics. In qualitative terms, resolution refers
to the degree of detail shown in an image: an image showing finer detail is
said to have a finer resolution than one showing coarser detail. Satellites
and other airborne platforms are particularly useful for collecting
information on a regional scale, and different satellites provide data of
different types and quality. Each remote sensing system has four major
resolutions associated with it (spectral, spatial, temporal and radiometric),
together with the closely related concept of geometric resolution. The
scientist must understand the following resolutions to extract meaningful
biophysical or hybrid information from remotely sensed imagery:
• Spectral
• Spatial
• Temporal
• Radiometric
• Geometric
14.1.1 Spatial Resolution
The smallest angular separation between two objects that can be
distinguished is measured by spatial resolution. For satellite images this is
expressed through pixels, and the spatial resolution of a specific image is
expressed as the ground distance represented by each pixel, in meters.
For example, the multispectral scanner on the SPOT 4 satellite has a
spatial resolution of 20 meters, meaning that each individual square pixel
represents a ground area of 400 square meters. Pixel size and resolution
are not always the same, for instance when many images are combined
and pixel values are averaged to represent a larger region.
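The pixel-area arithmetic in the SPOT example above can be sketched in a few lines (an illustrative Python helper, not part of any satellite software):

```python
# Ground area represented by one square pixel, given the spatial
# resolution (pixel side length in metres). Illustrative helper only.

def pixel_ground_area(resolution_m: float) -> float:
    """Return the ground area, in square metres, of one square pixel."""
    return resolution_m ** 2

# SPOT 4 multispectral scanner: 20 m resolution -> 400 m^2 per pixel
print(pixel_ground_area(20))  # 400
# Landsat: 30 m resolution -> 900 m^2 per pixel
print(pixel_ground_area(30))  # 900
```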

Fig. 14.1 Spatial Resolution
14.1.2 Spectral Resolution
The number and width of the bands of the electromagnetic spectrum that a
remote sensing platform can record is referred to as spectral resolution. For
example, the first two Landsat satellites carried the Multispectral Scanner
(MSS), which took images in four spectral bands (green, red, and two
near-infrared bands). Hyperspectral platforms can collect hundreds of
bands across the electromagnetic spectrum (e.g., Hyperion). Spectral
resolution refers to the wavelength intervals that the sensor can detect.
Sensors which can discriminate fine spectral differences are said to have a
high spectral resolution. In other words, spectral resolution is the number
and width of the specific wavelength intervals in the electromagnetic
spectrum to which a remote sensing instrument is sensitive, or the sensing
and recording power of the sensor in different bands of EMR. It is the ability
of the sensor to distinguish finer variations in the radiation reflected from
different objects.

Fig. 14.2 Spectral Resolution

14.1.3 Radiometric Resolution
Radiometric resolution refers to how much information is contained in a
pixel and is measured in bits. Each bit represents a power of two, so a
1-bit image has 2^1 = 2 grey levels, while an 8-bit image has 2^8 = 256
levels (values 0-255). Radiometric resolution is the sensitivity of a remote
sensing platform in detecting slight variations in energy, specifically radiant
flux (radiant energy emitted per unit time). Remote sensing platforms
usually carry either passive or active sensors. Passive sensors record
electromagnetic radiation reflected from the earth's surface, whereas active
sensors illuminate the earth's surface with machine-made electromagnetic
radiation and measure the amount of radiant flux that returns to the sensor.

Fig. 14.3 Radiometric resolution: 2-bit, 4-bit and 8-bit images


14.1.4 Temporal Resolution
In addition to spatial, spectral, and radiometric resolution, the concept of
temporal resolution is also important to consider in a remote sensing
system. The revisit period of a satellite sensor is usually several days.
Therefore, the absolute temporal resolution of a remote sensing system to
image the exact same area at the same viewing angle a second time is
equal to this period. However, because of some degree of overlap in the
imaging swaths of adjacent orbits for most satellites and the increase in this
overlap with increasing latitude, some areas of the Earth tend to be re-
imaged more frequently. Also, some satellite systems can point their
sensors to image the same area between different satellite passes
separated by periods from one to five days.

Fig. 14.4 Temporal Resolution
14.1.5 Geometric Resolution
Geometric resolution refers to the satellite sensor's ability to effectively
image a portion of the Earth's surface in a single pixel and is typically
expressed in terms of ground sample distance (GSD). GSD is a term
encompassing the overall optical and systemic noise sources and is useful
for comparing how well one sensor can "see" an object on the ground within
a single pixel. For example, the GSD of Landsat is ≈30 m, which means the
smallest unit that maps to a single pixel within an image is ≈30 m x 30 m.
The GeoEye-1 commercial satellite has a GSD of 0.41 m, which compares
with the 0.3 m resolution achieved by some early military film-based
reconnaissance satellites.

14.2 Factors Affecting Remote Sensing Resolutions


Remote sensing is a technology that allows us to obtain information about
the Earth’s surface without physically being there. It involves the use of
platforms, sensors, and data processing techniques to capture and analyze
data. However, the resolution of remote sensing is affected by several
factors, which are discussed below:
14.2.1 Platforms and Sensors
Remote sensing platforms and sensors play a crucial role in determining the
resolution of remote sensing images. Different sensors and platforms have
varying capabilities, depending on their design, specifications, and mission
objectives.

For instance, satellites and airplanes are commonly used platforms for
remote sensing, and their sensors have varying spatial, spectral, and
temporal resolutions. High-resolution sensors can capture detailed images
of small areas, while low-resolution sensors capture broader areas with less
detail. The choice of the platform and sensor depends on the specific
application and the level of detail required.
14.2.2 Environmental Conditions
Environmental conditions such as cloud cover, atmospheric conditions, and
topography can affect the resolution of remote sensing images. For
example, clouds can obscure the surface features, affecting the visibility of
the ground.
Atmospheric conditions such as haze, dust, and smoke can also affect the
accuracy of remote sensing images by scattering or absorbing radiation.
Additionally, the terrain of the area being observed can also affect the
resolution, as it can cause shadows and distortions in the images.
14.2.3 Data Processing and Analysis
The processing and analysis of remote sensing data can also affect the
resolution of the resulting images. The level of preprocessing and calibration
of the data can significantly impact the accuracy and detail of the images.
Additionally, image enhancement techniques such as filtering, contrast
stretching, and sharpening can improve the visual quality of the images but
may also introduce artifacts and reduce the spatial resolution.
14.2.4 User Requirements and Applications
Finally, the resolution of remote sensing images can also be influenced by
the user’s requirements and the intended application. Different applications
may require different levels of resolution, depending on the objectives of the
analysis.
For example, mapping applications may require high-resolution images to
identify and map small features accurately. Conversely, broad-scale
analyses such as land cover mapping may require lower resolution images
to cover larger areas efficiently.

Let us Sum Up
Remote Sensing is the science and art of acquiring information (spatial,
spectral, radiometric, and temporal) about material objects, area, or
phenomenon, without coming into physical contact with the objects, or area,

or phenomenon under investigation. Remotely sensed data are collected
using either passive or active remote sensing systems. Passive sensors
collect naturally available EMR, while active sensors generate their own
EMR and record the reflected energy.
complete one rotation in its orbit around the earth. The width of the area on
the Earth's surface imaged by the sensor during a single pass is referred to
as the swath of a satellite. Radiometric resolution refers to how much
information is contained in a pixel and is measured in bits. Each bit
represents a power of two, so a 1-bit image has 2^1 = 2 grey levels and an
8-bit image has 2^8 = 256 levels (values 0-255).

Glossary
Hyperion: A hyperspectral imaging sensor carried on NASA's Earth
Observing-1 (EO-1) satellite, capable of recording about 220 spectral bands.

Equatorial plane: A plane passing through the equator of the Earth or


another celestial body, perpendicular to its axis of rotation and equidistant
from its poles.

Check Your Progress


1. What is resolution?
The term resolution is used in remote sensing to describe the resolving
power, which includes the ability to identify not just the presence of two
objects, but also their characteristics.
2. What are the types of resolutions?
Spatial, Spectral, Temporal and Radiometric resolution.

3. Write about orbital inclination?


Orbital inclination is the angle between a satellite's orbital plane and
the Earth's equatorial plane. A Sun-synchronous remote sensing satellite
normally has an inclination of about 99 degrees, while a satellite orbiting
in the equatorial plane has an inclination of about 0 degrees.

Suggested Readings
1. Campbell, J., 1989. Introduction to Remote Sensing, Guilford, New
York.
2. Lillesand, T.M. and Kiefer, R.W., 1994. Remote Sensing and Image
Interpretation, John Wiley & Sons, New York.
3. Lueder, D.R., 1959. Aerial Photographic Interpretation: Principles and
Applications, McGraw-Hill, New York.

Web Sources
1. https://www.nrcan.gc.ca/maps-tools-and-publications/satellite-imagery-and-photos/tutorial-fundamentals-remote-sensing/satellites-and-sensors/satellite-characteristics-orbits-and-swaths/9283

2. https://www.spatialpost.com/types-of-resolution-in-remote-sensing/

Unit 15
Visual Image Interpretation
Structure
Overview
Learning Objectives
15.1 Visual Image Interpretation

15.2 Elements of Visual Image Interpretation


15.2.1 Tone or Colour
15.2.2 Size
15.2.3 Shape
15.2.4 Texture
15.2.5 Pattern
15.2.6 Shadow
15.2.7 Site
15.2.8 Association
15.3 Interpretation Key
15.4 Field Verification
Let us Sum Up
Glossary
Check Your Progress
Suggested Readings

Web Sources

Overview
Visual image interpretation is a first analysis approach to remote sensing
imagery. Here, the size, shape, and position of objects as well as the
contrast and colour saturation are analysed. Analysis of remote sensing
imagery involves the identification of various targets in an image, and those
targets may be environmental or artificial features which consist of points,
lines, or areas. Targets may be defined in terms of the way they reflect or

emit radiation. This radiation is measured and recorded by a sensor, and
ultimately is depicted as an image product such as an air photo or a satellite
image. Observing the differences between targets and their backgrounds
involves comparing different targets based on any, or all, of the visual
elements of tone, shape, size, pattern, texture, shadow, and association.
Learning Objectives
After Learning this lesson, you will be able to:
• Visual Image Interpretation
• Elements of Visual Image Interpretation

• Interpretation Keys
• Visual Interpretation using colour composite.

15.1 Visual Image Interpretation


Visual image interpretation is the process of an analyst / interpreter
recognizing elements on images and communicating information gathered
from these images to others for the purpose of evaluating their importance.
The information generated from the interpretation process may be more
authentic and reliable if the interpreter has a visual and photographic sense.
Visual interpretation involves the visual analysis of aerial photographs
and satellite images; when the interpretation is carried out with the help of
a computer, it is termed digital image processing. Visual perception is the
ability to interpret information and surroundings from the effects of visible
light reaching the eye. Interpretation is the process of extracting qualitative
and quantitative information about objects from aerial photographs or
satellite images.

15.2 Elements of Visual Image Interpretation


Interpretation of aerial photographs and satellite images differs from the
interpretation of everyday imagery in three important respects:
(1) the portrayal of features from an overhead, often unfamiliar,
perspective;
(2) the frequent use of wavelengths outside of the visible portion of the
spectrum; and
(3) the depiction of the earth's surface at unfamiliar scales.
Eight fundamental parameters or elements are used in the interpretation
of remote sensing images or photographs: tone or colour, texture, size,
shape, pattern, shadow, site and association. In some cases a single
element alone is sufficient for successful identification; in others, the use
of several elements will be required.

Fig. 15.1. Ordering of image elements in image interpretation.

15.2.1 Tone or colour


Tone is the relative brightness of a grey level on a black-and-white image,
or of a colour on a colour/FCC image. Tone is a measure of the intensity of
the radiation reflected or emitted by the objects of the terrain: objects with
lower reflectance appear relatively dark, while objects with higher
reflectance appear bright. Figure 15.2a shows a band imaged in the NIR
region of the electromagnetic spectrum; water reflects little in the NIR
region, so the river appears black, while vegetation reflects strongly and
appears bright. Our eyes can discriminate only 16-20 grey levels in a
black-and-white photograph, whereas hundreds of colours can be
distinguished in a colour photograph. In multispectral imaging, an optimal
combination of three bands is used to generate a colour composite image.
A False Colour Composite (FCC) using the NIR, red and green bands is the
most preferred combination for visual interpretation. In a standard FCC, the
NIR band is displayed through the red channel, the red band through the
green channel, and the green band through the blue channel. Because
vegetation reflects strongly in the NIR region, it appears red in a standard
FCC (Fig. 15.2b), which makes vegetation identification easier.
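The standard FCC band-to-channel assignment described above can be sketched with NumPy; the tiny 2x2 band arrays below are made-up sample reflectance values, not real satellite data:

```python
import numpy as np

# Build a standard False Colour Composite: NIR -> red channel,
# red band -> green channel, green band -> blue channel.
# Top row simulates vegetation (strong NIR); bottom row simulates
# water (weak NIR). All values are invented for illustration.

nir   = np.array([[210, 200], [10, 12]], dtype=np.uint8)
red   = np.array([[40, 38], [25, 22]], dtype=np.uint8)
green = np.array([[60, 55], [35, 30]], dtype=np.uint8)

# Stack the bands into an RGB display image in FCC order
fcc = np.dstack([nir, red, green])

print(fcc.shape)   # (2, 2, 3)
print(fcc[0, 0])   # vegetation pixel: strong NIR mapped to red channel
```

Because the NIR band drives the red display channel, the vegetation pixels dominate in red, which is exactly why vegetation appears red in a standard FCC.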

Fig. 15.2. Satellite image of area in (a) grey scale and in (b) standard FCC.

15.2.2 Size
The size of objects can be important in discriminating objects and features
(single-family vs. multi-family residences, shrubs vs. trees, etc.). When size
is used as a diagnostic characteristic, both the relative and absolute sizes
of objects can be important. Size can also be used in judging the
significance of objects and features: the size of trees relates to the board
feet of timber which may be cut; the size of agricultural fields relates to
water use in arid areas, or the amount of fertilizer used; and the size of
runways gives an indication of the types of aircraft that can be
accommodated (figure 15.3). It is important to assess the size of a target
relative to other objects in a scene, as well as its absolute size, to aid in
the interpretation of that target.

Fig. 15.3. Size

15.2.3 Shape
Shape refers to the general form, configuration, or outline of an individual
object. Shape is one of the most important single factors for recognizing
objects in an image. Generally, regular shapes such as squares, rectangles
and circles are signs of man-made objects, e.g., buildings, roads, and
cultivated fields, whereas irregular shapes with no distinct geometrical
pattern are signs of a natural environment, e.g., a river or forest. A common
case of misinterpretation is between roads and railway lines: roads can
have sharp turns and perpendicular junctions, but railway lines do not.
From the shape of the following image, it can easily be said that the dark
blue coloured object is a river.

Fig. 15.4. Shape


15.2.4 Texture
Texture refers to the frequency of tonal variation in an image. Texture is
produced by an aggregate of features which may be too small to be clearly
discerned individually on the image. It depends on the shape, size, pattern
and shadow of terrain features, and is always scale or resolution
dependent. Objects with similar reflectance may differ in texture, which
helps in their identification. For example, in a high-resolution image,
grassland and tree crowns have a similar tone, but grassland has a smooth
texture compared to trees. Smooth texture refers to little tonal variation,
while rough texture refers to abrupt tonal variation in an image or
photograph.

Fig. 15.5: Textural variations
15.2.5 Pattern
Pattern is the spatial arrangement of objects, which can be either man-made
or natural. Pattern is a macro image characteristic: the regular arrangement
of objects can be diagnostic of features on the landscape. Arrangements of
complex drainage in the form of ravines can be identified easily. Likewise,
the network or grid of streets in a subdivision or urban area can aid
identification and help in problem solving, such as studying the growth
patterns of a city. Patterns can also be very important in geological or
geomorphological analysis: drainage pattern can tell the trained observer a
great deal about the lithology and structural patterns of an area (figure 15.6).
Dendritic drainage patterns develop on flat-bedded sediments; radial
patterns on or over domes; and linear or trellis patterns in areas with faults
or other structural controls.

Fig. 15.6. Pattern

15.2.6 Shadow
It is useful in interpretation as it may provide an idea of the profile and
relative height of a target or targets which may make identification easier.
However, shadows can also reduce or eliminate interpretation in their area
of influence, since targets within shadows are much less (or not at all)
discernible from their surroundings. Shadow is also useful for enhancing or
identifying topography and landforms.

Fig. 15.7. Shadow


15.2.7 Site
Aspects, topography, geology, soil, vegetation, and cultural features such
as salt pans, settlements, industrial establishments etc. on the landscape
are distinctive factors that the interpreter should use when examining a site.
The relative importance of each of these factors will vary with local
conditions, but all are important. Just as some vegetation grows in swamps,
other vegetation grows on sandy ridges, and agricultural crops favour
certain conditions. Man-made features may also be sited on rivers (e.g., a
power plant) or on a hilltop (an observatory or radar facility).
15.2.8 Association
It considers the relationship between other recognizable objects or features
in proximity to the target of interest. The identification of features that one
would expect to associate with other features may provide information to
facilitate identification. Some objects are so commonly associated with one

another that identification of one tends to indicate or confirm the existence
of the other. Smokestacks, plant buildings, cooling ponds, transformer
yards, coal piles and railroad tracks together indicate a coal-fired power
plant. Likewise, arid terrain, a basin-bottom location, a highly reflective
surface, sparse vegetation, and a playa or water body surrounded by
saline patches indicate salt production units. Association is one of the most
helpful clues in identifying man-made installations. Aluminium manufacture
requires large amounts of electrical energy, so the absence of a power
supply may rule out this industry. Cement plants have rotary kilns. Schools
at different levels typically have characteristic playing fields, parking lots
and clusters of buildings in urban areas.

Fig. 15.8. Salt Pan

15.3 Interpretation Key


An interpretation key is a set of guidelines used to assist interpreters in
rapidly identifying features. Determination of the type of key and the method
of presentation to be employed will depend upon,
• The number of objects or conditions to be identified; and,

• The variability typically encountered within each class of features or


objects within the key.
Some authors say that generally, keys are more easily constructed and
used for the identification of man-made objects and features than for natural
vegetation and landforms. For analysis of natural features, training and field
experience are often essential to achieve consistent results. Basically, an
interpretation key helps the interpreter organize the information present in
image form and guides him/her to the correct identification of unknown
objects. Keys can be used in conjunction with any type of remotely sensed
data. Such keys can differ from those employed in other disciplines in that

they can consist largely of illustrations, e.g., landforms, industrial facilities,
military installations. Many types of keys are already available if you can
find or get your hands on them. This can often be very difficult and a reason
why people develop their own keys. Depending upon the way the diagnostic
features are organized, two types of keys are generally recognized. 1)
Selective keys and 2) Elimination keys. Selective keys are arranged in such
a way that an interpreter simply selects that example that most closely
corresponds to the object they are trying to identify, e.g., industries,
landforms etc. Elimination Keys are arranged so that the interpreter follows
a precise stepwise process that leads to the elimination of all items except
the one(s) that they are trying to identify. Dichotomous keys are essentially
a class of elimination key. Most interpreters prefer to use elimination keys
in their analyses (Colwell, 1997; Olson, 1960).
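The stepwise logic of an elimination key can be sketched as a simple decision procedure; the questions and classes below are invented for illustration, drawing on the road/railway example from Section 15.2.3:

```python
# Toy sketch of an elimination (dichotomous) key: each yes/no answer
# eliminates candidate classes until one remains. Not a published key;
# the features and class names are made up for illustration.

def classify_feature(regular_shape: bool, linear: bool, has_sharp_turns: bool) -> str:
    """Follow a stepwise elimination key for a simple feature set."""
    if not regular_shape:
        # Irregular outlines suggest a natural feature
        return "natural feature (e.g., river or forest)"
    if linear:
        # Roads may turn sharply and join at right angles; rail lines do not
        return "road" if has_sharp_turns else "railway line"
    return "building or cultivated field"

print(classify_feature(regular_shape=True, linear=True, has_sharp_turns=False))
# railway line
```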

15.4 Field Verification


Field verification can be considered as a form of collateral material because
it is typically conducted to assist in the analysis process. Essentially, this is
the process of familiarizing the interpreter with the area or type of feature.
This type of verification is done prior to the interpretation to develop a visual
"signature" of how the feature(s) of interest appear on the ground. After an
interpretation is made field verification can be conducted to verify accuracy.
The nature, amount, timing, method of acquisition, and data integration
procedures should be carefully thought out.

Let us Sum Up
The information generated from the interpretation process may be more
authentic and reliable. The relative brightness or colour of features in an
image is referred to as tone. The spatial arrangement of objects is known
as pattern. The recognizable pattern is formed by orderly repetition of similar
tones and textures. Shadow makes targets easier to identify, as it gives an
idea of the profile and relative height of a target.

Glossary
Association: Association refers to the occurrence of certain features in
relation to other objects in the imagery. In an urban area, a smooth
vegetation pattern generally indicates a playground or grassland, not
agricultural land.

Site: Site refers to a topographic or geographic location. It is also an
important element in image interpretation when objects cannot be clearly
identified using the other elements. A very high-reflectance feature in a
Himalayan valley may be snow or cloud, but in Kerala one cannot interpret
it as snow.

Interpretation Key: The criterion for identification of an object with
interpretation elements is called an interpretation key.

Check Your Progress


1. What is visual image interpretation?

Visual image interpretation is the process of an analyst / interpreter


recognizing elements on images and communicating information
gathered from these images to others for the purpose of evaluating their
importance.
2. How are the textures classified?
Textures are classified as rough and smooth. Even surfaces such as
grass land, fields will come under smooth textures and irregular
structure with rough surface such as forest canopy results in rough
textures.
3. Define shape.
Shape refers to the structure, or outline of individual objects. Shape is a
unique clue for interpretation. Normally urban or agricultural targets are
represented by straight edge shapes whereas natural features including
forests are mostly irregular in shapes.

Suggested Readings
1. Campbell, J., 1989. Introduction to Remote Sensing, Guilford, New
York.
2. Lillesand, T.M. and Kiefer, R.W., 1994. Remote Sensing and Image
Interpretation, John Wiley & Sons, New York.

Web Sources
1. http://ecoursesonline.iasri.res.in/

2. https://natural-resources.canada.ca/
3. https://www.dspmuranchi.ac.in/pdf/Blog/Fundamental%20of%20Visua
l%20Image%20Interpretation%20&%20Its%20Keys.pdf

Unit 16
Digital Image Classification
Structure
Overview
Learning Objectives
16.1 Digital Image Classification

16.2 Supervised Classification


16.2.1 Steps involved in Supervised Classification
16.2.2 Advantages and Disadvantages of Supervised Classification

16.3 Unsupervised Classification


16.3.1 K-Means Clustering
16.3.2 ISODATA Clustering
Let us Sum Up
Glossary
Check Your Progress
Suggested Readings
Web Sources

Overview
Image classification is the process of categorizing and labelling groups of
pixels or vectors within an image into one of several predefined classes.
Image classification is a major part of digital image analysis. Classification
between objects is a difficult task and therefore image classification has
been an important task within the field of computer vision. The major steps
for image classification include training, classifying and then recognising the
pattern of the image. To classify an image, we must first understand the
relationship between the data and the classes. There are three commonly
used methods of image classification: supervised, unsupervised and
object-based image analysis.
Unsupervised classification is essentially the inverse of supervised
classification. The analyst initially groups spectral classes based on
numerical information in the data, and then matches them to information

classes. Clustering algorithms are programmes used to determine the
natural (statistical) groupings or structures in the data. Typically, the
analyst sets the number of groups or clusters to look for in the data. In
addition to the number of classes, the analyst may also define parameters
relating to the separation distance between clusters. As a result,
unsupervised classification is not completely without human assistance.
However, unlike supervised classification, it does not begin with a
pre-determined set of classes.
Learning Objectives
After learning this lesson, you will be able to:
• explain supervised classification and the basic steps used to apply it
• describe unsupervised classification and its common clustering algorithms

16.1 Digital Image Classification


Digital image classification uses the quantitative spectral information
contained in an image, which is related to the composition or condition of
the target surface. Image analysis can be performed on multispectral as well
as hyperspectral imagery. It requires an understanding of the way materials
and objects of interest on the earth's surface absorb, reflect, and emit
radiation in the visible, near infrared, and thermal portions of the
electromagnetic spectrum. In order to make use of image analysis results
in a GIS environment, the source image should be orthorectified so that the
final image analysis product, whatever its format, can be overlaid with other
imagery, terrain data, and other geographic data layers. Classification
results are initially in raster format, but they may be generalized to polygons
with further processing. There are several core principles of image analysis
that pertain specifically to the extraction of information and features from
remotely sensed data.

• Spectral differentiation is based on the principle that objects of different


composition or condition appear as different colours in a multispectral or
hyperspectral image. For example, a newly planted cornfield has a
distinct colour when compared to a field of mature plants, and yet
another colour when the field has been harvested. Corn has a distinct
colour as compared to wheat; healthy plants are a different colour than
pest-infested or drought-impacted plants. The use of spectral signature,
or colour, to distinguish types of ground cover or objects is called
spectral differentiation.

• Radiometric differentiation is the detection of differences in brightness,
which may in certain cases be used to inform the image analyst as to
the nature or condition of the remotely sensed object.
• Spatial differentiation is related to the concept of spatial resolution. We
may be able to analyze the spectral content of a particular pixel or group
of pixels in a digital image when those pixels comprise a single
homogeneous material or object. It is also important to understand the
potential for mixing of the spectral signatures of multiple objects into the
recorded spectral values for a single pixel. When designing an image
analysis task, it is important to consider the size of the objects to be
discovered or studied compared to the ground sample distance of the
sensor.
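The mixed-pixel effect mentioned above can be illustrated with a toy calculation. The band values and cover types below are hypothetical, chosen only to show how a single pixel's recorded spectrum can be an area-weighted blend of two signatures:

```python
import numpy as np

# Hypothetical mean reflectances in four bands (blue, green, red, NIR)
# for two pure cover types; the numbers are illustrative only.
vegetation = np.array([0.04, 0.08, 0.05, 0.45])
bare_soil = np.array([0.10, 0.15, 0.20, 0.25])

# If 60% of a pixel's ground footprint is vegetation and 40% is soil,
# the sensor records (to a first approximation) the area-weighted mix:
mixed_pixel = 0.6 * vegetation + 0.4 * bare_soil

# The mixed spectrum lies between the two pure signatures in every band,
# so it matches neither class exactly.
print(mixed_pixel)
```

Because such a pixel resembles neither pure signature, objects much smaller than the ground sample distance cannot be reliably separated by spectral means alone.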

Fig. 16.1 Image Classification

16.2 Supervised Classification
Supervised classification is based on the idea that a user can select sample
pixels in an image that are representative of specific classes and then direct
the image processing software to use these training sites as references for
the classification of all other pixels in the image. Training sites are selected
based on the knowledge of the user. The user also sets the bounds for how
similar other pixels must be to group them together. These bounds are often
set based on the spectral characteristics of the training area, plus or minus
a certain increment. The user also designates the number of classes that
the image is classified into. Many analysts use a combination of supervised
and unsupervised classification processes to develop final output analysis
and classified maps.
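One simple way to realise the "mean plus or minus a certain increment" rule described above is a parallelepiped-style bounds test, sketched below in Python with NumPy. The band values and the two-standard-deviation increment are illustrative assumptions, not conventions of any particular software:

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Hypothetical training site: 50 pixels x 4 bands for one land cover class.
training = rng.normal(loc=[0.05, 0.09, 0.06, 0.42], scale=0.01, size=(50, 4))

# Bounds derived from the training area: mean +/- a chosen increment (2 std).
mean = training.mean(axis=0)
std = training.std(axis=0)
lower, upper = mean - 2 * std, mean + 2 * std

def belongs(pixel):
    """True if the pixel falls inside the class bounds in every band."""
    return bool(np.all((pixel >= lower) & (pixel <= upper)))

print(belongs(mean))             # the class mean itself is inside the bounds
print(belongs(mean + 10 * std))  # a very different pixel is rejected
```

A pixel is accepted only if it lies inside the bounds in every band, which is why representative training sites matter: bounds estimated from unrepresentative samples reject valid members of the class.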
16.2.1 Steps involved in Supervised Classification
A supervised classification algorithm requires a training sample for each
class, that is, a collection of data points known to have come from the class
of interest. The classification is thus based on how “close” a point to be
classified is to each training sample. We shall not attempt to define the word
“close” other than to say that both geometric and statistical distance
measures are used in practical pattern recognition algorithms. The training
samples are representative of the known classes of interest to the analyst.
Classification methods that rely on the use of training patterns are called
supervised classification methods. In supervised classification, you select
representative samples for each land cover class. The software then uses
these “training sites” and applies them to the entire image. The three basic
steps involved in a typical supervised classification procedure are as
follows:

(i) Training stage: The analyst identifies representative training areas
and develops numerical descriptions of the spectral signatures of
each land cover type of interest in the scene. Training sites are areas
that are known to be representative of a particular land cover type.
The computer determines the spectral signature of the pixels within
each training area, and uses this information to define the statistics,
including the mean and variance of each of the classes. Preferably
the location of the training sites should be based on field collected
data or high-resolution reference imagery. It is important to choose
training sites that cover the full range of variability within each class to
allow the software to accurately classify the rest of the image. If the
training areas are not representative of the range of variability found
within a particular land cover type, the classification may be much less
accurate. Multiple, small training sites should be selected for each
class. The more time and effort spent in collecting and selecting
training sites the better the classification results.
(ii) The classification stage (Decision Rule) or Generate signature file:
Each pixel in the image data set is categorized into the land cover
class it most closely resembles. If a pixel is not sufficiently similar
to any training data set, it is usually labelled ‘Unknown’.
(iii) The output stage or Classify: The results may be used in several
different ways. Three typical forms of output products are thematic
maps, tables and digital data files, which become input data for GIS.
The output of image classification becomes input for GIS for spatial
analysis of the terrain. Fig. 2 depicts the flow of operations to be
performed during image classification of remotely sensed data of an
area, which ultimately leads to creating a database as an input for GIS.
Plate 6 shows the land use/land cover colour-coded image, which is
an output of image classification.
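The three stages above can be sketched end to end. The fragment below is an illustrative minimum-distance-to-means classifier in Python with NumPy (one of the simplest decision rules; production packages also offer maximum likelihood and others), using made-up two-band training data:

```python
import numpy as np

def train(samples_by_class):
    """Training stage: reduce each class's training pixels to a mean signature."""
    return {name: np.asarray(pixels, dtype=float).mean(axis=0)
            for name, pixels in samples_by_class.items()}

def classify(image, signatures, max_dist=None):
    """Classification stage: assign each pixel to the nearest class mean.
    Pixels farther than max_dist from every mean are labelled 'Unknown'."""
    names = np.array(list(signatures), dtype=object)
    means = np.stack([signatures[n] for n in names])   # (classes, bands)
    pixels = image.reshape(-1, image.shape[-1])        # (pixels, bands)
    dists = np.linalg.norm(pixels[:, None, :] - means[None, :, :], axis=2)
    labels = names[dists.argmin(axis=1)]
    if max_dist is not None:
        labels[dists.min(axis=1) > max_dist] = "Unknown"
    return labels.reshape(image.shape[:-1])            # output stage: thematic map

# Hypothetical training sites (two spectral bands per pixel).
sigs = train({"water": [[0.02, 0.01], [0.03, 0.02]],
              "veg":   [[0.05, 0.40], [0.06, 0.45]]})
scene = np.array([[[0.02, 0.015], [0.055, 0.43]]])     # a 1 x 2 pixel image
print(classify(scene, sigs))
```

The returned array of class labels is the raster thematic map described in the output stage; in practice it would be exported for use in a GIS.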
16.2.2 Advantages and Disadvantages of Supervised
Classification:
In supervised classification most of the effort is done prior to the actual
classification process. Once the classification is run the output is a thematic
image with classes that are labeled and correspond to information classes
or land cover types. Supervised classification can be much more accurate
than unsupervised classification, but depends heavily on the training sites,
the skill of the individual processing the image, and the spectral distinctness
of the classes. If two or more classes are very similar to each other in terms

of their spectral reflectance (e.g., annual-dominated grasslands vs.
perennial grasslands), misclassifications will tend to be high. Supervised
classification requires close attention to the development of training data. If
the training data are poor or not representative, the classification results will
also be poor. Therefore, supervised classification generally requires more
time and money than unsupervised classification.

16.3 Unsupervised Classification


As the name implies, this form of classification is done without interpretive
guidance from an analyst. An algorithm automatically organises similar pixel
values into groups that become the basis for different classes. This is
entirely based on the statistics of the image data distribution and is often
called clustering. The process is automatically optimised according to
cluster statistics without the use of any knowledge-based control (i.e.
ground referenced data). The method is, therefore, objective and entirely
data driven. It is particularly suited to images of targets or areas where there
is no ground knowledge. Even for a well-mapped area, unsupervised
classification may reveal some spectral features which were not apparent
beforehand. The basic steps of unsupervised classification are shown
below.

• the algorithm clusters the data and finds the inherent spectral classes,
• pixels are classified based on the clusters, producing a spectral class map,
• the analyst labels the clusters (which may involve grouping several
clusters), producing an informational class map.

The result of an unsupervised classification is an image of statistical


clusters, where the classified image still needs interpretation based on
knowledge of thematic contents of the clusters. There are hundreds of
clustering algorithms available for unsupervised classification and their use
varies by efficiency and purpose. K-means and ISODATA are the widely
used algorithms which are discussed here.
16.3.1 K-Means Clustering
The k-means algorithm assigns each pixel to a group based on an initial
selection of mean values. The iterative re-definition of groups continues
until the means stabilise, changing by less than a threshold between
iterations. Pixels belonging to the groups are then classified using a
minimum-distance-to-means rule or another decision principle. The k-means
clustering algorithm thus helps split a given unknown dataset into a fixed,
user-defined number (k) of clusters. The objective of the algorithm is to
minimise the variability within each cluster.
The data point at the center of a cluster is known as a centroid. In most of
the image processing software, each centroid is an existing data point in the
given input data set, picked at random, such that all centroids are unique.
Initially, a randomised set of clusters is produced. Each centroid is thereafter
set to the arithmetic mean of the cluster it defines. The process of classification
and centroid adjustment is repeated until the values of centroids stabilise.
The final centroids are used to produce final classification or clustering of
input data, effectively turning a set of initially anonymous data points into a
set of data points, each with a class identity.
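The procedure described above can be condensed into a short sketch. This is an illustrative NumPy implementation run on synthetic two-band pixels, not the code of any particular image-processing package:

```python
import numpy as np

def kmeans(pixels, k, max_iter=20, seed=0):
    """Minimal k-means: centroids start as k distinct random pixels, then
    assignment and mean-update alternate until the centroids stabilise."""
    rng = np.random.default_rng(seed)
    centroids = pixels[rng.choice(len(pixels), size=k, replace=False)]
    for _ in range(max_iter):
        # Assign every pixel to its nearest centroid.
        dists = np.linalg.norm(pixels[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Move each centroid to the arithmetic mean of its cluster.
        new = np.array([pixels[labels == c].mean(axis=0) if np.any(labels == c)
                        else centroids[c] for c in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    return labels, centroids

# Two tight, well-separated synthetic clusters of two-band pixels.
rng = np.random.default_rng(42)
pixels = np.vstack([rng.normal(0.1, 0.01, size=(20, 2)),
                    rng.normal(0.8, 0.01, size=(20, 2))])
labels, centroids = kmeans(pixels, k=2)
```

Note that the result depends on the random initial centroids and that k must be supplied by the user, which are exactly the disadvantages listed below.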

Advantage
• The main advantage of this algorithm is its simplicity and speed, which
allow it to run on large datasets.
Disadvantages
• It does not yield the same result with each run, since the resulting
clusters depend on the initial random assignments.
• It is sensitive to outliers; for such datasets, k-medians clustering is
used instead.
• The number of clusters must be specified as an input to the algorithm.
16.3.2 ISODATA Clustering
ISODATA (Iterative Self-Organising Data Analysis Technique) clustering
method is an extension of k-means clustering method (ERDAS, 1999). It
represents an iterative classification algorithm and is useful when one is not
sure of the number of clusters present in an image. It is iterative because it
makes a large number of passes through the remote sensing dataset until
specified results are obtained. Good results are obtained if all bands in
the remote sensing image have similar data ranges. The method includes
automated merging of similar clusters and splitting of heterogeneous
clusters. It requires the user to input the maximum number of clusters
wanted, a convergence threshold and the maximum number of iterations to be
performed. ISODATA clustering takes place in the following steps:
• k arbitrary cluster means are established,
• all pixels are relocated into the closest clusters by computing the
distance between each pixel and each cluster mean,
• the centroids of all clusters are recalculated, and the previous step
is repeated until the convergence threshold is reached, and
• clustering is considered complete only when the number of clusters is
within the specified limit and the distances between clusters meet a
prescribed threshold.
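The steps above can be sketched as a much-simplified ISODATA loop. This toy NumPy version (the thresholds and the split/merge heuristics are illustrative assumptions, far simpler than commercial implementations such as ERDAS's) shows the key idea of splitting heterogeneous clusters and merging close ones:

```python
import numpy as np

def isodata(pixels, k_init=2, max_iter=10, split_std=0.15, merge_dist=0.1, seed=0):
    """Toy ISODATA: k-means-style passes, plus splitting of clusters whose
    per-band spread exceeds split_std and merging of centroids closer
    than merge_dist."""
    rng = np.random.default_rng(seed)
    centroids = pixels[rng.choice(len(pixels), size=k_init, replace=False)]
    for _ in range(max_iter):
        # Relocate all pixels into their closest clusters.
        d = np.linalg.norm(pixels[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        groups = [pixels[labels == c] for c in range(len(centroids))]
        groups = [g for g in groups if len(g) > 0]       # drop empty clusters
        new = []
        for g in groups:
            mean, std = g.mean(axis=0), g.std(axis=0)
            if std.max() > split_std and len(g) > 2:
                # Split a heterogeneous cluster along its most variable band.
                offset = np.zeros_like(mean)
                offset[std.argmax()] = std.max()
                new += [mean - offset, mean + offset]
            else:
                new.append(mean)
        # Merge centroid pairs that are closer than merge_dist.
        merged = []
        for c in new:
            if not any(np.linalg.norm(c - m) < merge_dist for m in merged):
                merged.append(c)
        centroids = np.array(merged)
    d = np.linalg.norm(pixels[:, None, :] - centroids[None, :, :], axis=2)
    return d.argmin(axis=1), centroids

# Three tight synthetic clusters; start deliberately with too few centroids.
rng = np.random.default_rng(7)
centres = np.array([[0.1, 0.1], [0.5, 0.5], [0.9, 0.9]])
pixels = np.vstack([rng.normal(c, 0.01, size=(15, 2)) for c in centres])
labels, centroids = isodata(pixels, k_init=2)
```

Even though only two clusters were requested initially, the splitting rule lets the algorithm discover the third group, which is why ISODATA is useful when the number of clusters is not known in advance.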

Advantages
• It is good at finding the “true” clusters within the data.
• It is not biased towards the top pixels in the image.
• It does not require the image data to be normally distributed.
• Cluster signatures can be saved, and these can be easily incorporated
and manipulated along with supervised spectral signatures.
Disadvantages
• It is time consuming.
• It requires the maximum number of clusters, a convergence threshold
and the maximum number of iterations as inputs to the algorithm.

Let us Sum Up
Supervised image classification is a type of classification in which the user
or image analyst ‘supervises’ the pixel classification process. The user
assigns the various pixel values or spectral signatures to each class.
Maximum likelihood is a statistical approach to pattern recognition in which
the probability of a pixel belonging to each of a predefined set of classes is
calculated.
Support vector machines are supervised non-parametric statistical learning
techniques that do not require any assumptions about the underlying data
distribution.

An unsupervised classifier does not compare the pixels to be categorized
with any previously collected data. Rather, it evaluates a huge number of
unknown data points and separates them into groups based on data-specific
features.
K-means is one of the most basic unsupervised learning algorithms for
solving the well-known problem of clustering. The approach uses a simple
and straightforward method to classify a given data set using a
predetermined number of clusters (assuming k clusters).

ISODATA clustering involves combining clusters with similar spectral
signatures and splitting clusters with high variability.

Glossary
Spectral signature: The variation of a material's reflectance or emittance
with respect to wavelengths is known as spectral signature.
Spectral classes are clusters of pixels in the data that are uniform (or nearly
uniform) in terms of their brightness values in the various spectral channels.

A centroid is the imaginary or real location representing the center of the


cluster.
Training stage: The analyst identifies representative training areas and
develops numerical descriptions of the spectral signatures of each land
cover type of interest in the scene.
The classification stage: Each pixel in the image data set is categorized into
the land cover class it most closely resembles. If a pixel is not sufficiently
similar to any training data set, it is usually labelled ‘Unknown’.
The output stage: The results may be used in several different ways. Three
typical forms of output products are thematic maps, tables and digital data
files which become input data for GIS. The output of image classification
becomes the input for GIS for spatial analysis of the terrain.

Check Your Progress


1. What is spectral pattern recognition?
Spectral pattern recognition refers to a group of classification algorithms
that use pixel-by-pixel spectral data to classify land cover automatically.
2. What are the steps involved in supervised image classification?
Training stage (Create training set)
Generate signature file.
Classify Image
3. List the advantages of support vector machines.
Classify linearly and nonlinearly separable data using kernel functions.
Provide high classification accuracies and good generalization
capabilities.

Suggested Readings
1. Campbell J., 1989. Introduction to Remote Sensing, Guilford, New
York.
2. Thomas M. Lillesand and Ralph W. Kiefer, 1994. Remote Sensing and
Image Interpretation, John Wiley & Sons, New York.
3. Lueder D., 1959. Aerial Photographic Interpretation: Principles and
Applications, McGraw Hill, New York.

Web Sources
1. https://gisgeography.com/image-classification-techniques-remote-sensing/
2. https://www.ukessays.com/essays/engineering/supervised-image-classification-9746.php
3. https://core.ac.uk/download/pdf/234663192.pdf
4. https://sites.google.com/site/dataclusteringalgorithms/k-means-clustering-algorithm
5. https://people.utm.my/nurulhazrina/files/2015/05/L12-Unsupervised-classification.pdf

About Tamil Nadu Open University
Tamil Nadu Open University (TNOU), with its
Headquarters at Chennai was established in 2000 by an
Act of Tamil Nadu Legislature at the State level for the
introduction and promotion of Open University and
Distance Education in the educational and for the co-
ordination and determination of standards in such system.
The salient features of TNOU are relaxed entry rules,
maintenance of standards, individualized study, flexibility
in terms of place and duration of study, use of the latest
information and communication technology, a well-knit
student support services network, cost-effective
programmes, and collaboration and resource sharing with
other Universities.

School of Sciences
School of Sciences, established in 2004, has been offering the B.Sc. and M.Sc. programmes
in Mathematics since 2005 and B.Sc., Mathematics with Computer Application since 2007. In 2017,
B.Sc. programmes in Physics, Chemistry, Botany, and Zoology were introduced, while M.Sc.
programmes in Physics, Chemistry, Botany, and Zoology were launched in 2018. As per the
academic restructuring, the Departments of Geography and Apparel & Fashion Design were
merged into the School of Sciences in 2020, and these departments offer B.Sc. and M.Sc.
programmes.

The main objective is to excite the minds and hearts of rural students through constant
inquiry and active participation in Science. The School has blazed a trail of knowledge
transmission and generation, graduating over 25,000 Science students across the Nation. It has
built a niche for itself in the core areas of teaching, research, consultancy, administration, and
community services over the last 17 years.

The School consists of the Departments of Physics, Chemistry, Mathematics,
Botany, Zoology, Geography, and Apparel & Fashion Design. All of these
Departments offer various academic Programmes from Certificate to Research degree level
(M.Phil. & Ph.D.) in their respective disciplines.

The Department of Geography offers the following Programmes

• B.Sc., Geography (Semester - Both Tamil & English Medium)


• M.Sc., Geography (Semester)
• M.Phil., Geography (Full Time & Part-Time)
• Ph.D., Geography (Full Time & Part-Time)

For details contact:


Phone : 044 - 24306641
E-Mail : [email protected] / [email protected]

Tamil Nadu Open University


Chennai – 600 015.
www.tnou.ac.in
