
Digital Photogrammetry

Histogram-Based Information Extraction

Dr. Gamal Hassan Seedahmed


Dept. of Surveying Eng.
Faculty of Eng.
University of Khartoum

[email protected]
Image & Its Red Channel Histogram
Transforming the Histogram into a Probability Distribution

• This can be done by dividing every bin in the histogram by the total number of pixels in the image; in other words, each bin is divided by the size of the image:

P(Z_k) = \frac{n_k}{M \times N}, \qquad \sum_{k=0}^{255} P(Z_k) = 1

where P is the probability, Z_k is the intensity level, n_k is the number of times that intensity Z_k occurs in the image, and M \times N is the number of rows and columns of the image (the image size).
A MATLAB Code to Compute the Probability from the Histogram
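The code on the original slide is an image; the following is a minimal sketch of the same computation, assuming the Image Processing Toolbox function imhist and a hypothetical file name image.jpg.

% Minimal sketch (not the original slide code): pdf from the histogram.
% Assumes the Image Processing Toolbox (imhist); 'image.jpg' is a placeholder.
I = imread('image.jpg');            % read a color image
R = I(:,:,1);                       % red channel
[M, N] = size(R);                   % image size
nk = imhist(R, 256);                % histogram counts n_k for k = 0..255
P = nk / (M * N);                   % P(Z_k) = n_k / (M*N)
disp(sum(P))                        % sanity check: the probabilities sum to 1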
An Example of a Color Image: Histograms and Probabilities
From now on, let us call the probability density of the image the probability density function, or pdf for short.
Applications of the pdf

It can be used to compute:
• The average.
• Variance.
• Normalized variance.
• Nth moments.
• Skewness.
• Kurtosis.
• Uniformity.
• Entropy.

These quantities can be used to characterize the image texture.
Image Average Based on the pdf

Average = \sum_{k=0}^{255} Z_k \, P(Z_k)
A MATLAB Code to Compute the Image Average
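The slide's code is again an image; here is a minimal sketch under the same assumptions as the previous sketch (imhist, a placeholder file name).

% Minimal sketch (not the original slide code): image average from the pdf.
I = imread('image.jpg');            % placeholder file name
R = I(:,:,1);                       % red channel
P = imhist(R, 256) / numel(R);      % pdf: P(Z_k) = n_k / (M*N)
Zk = (0:255)';                      % intensity levels Z_k
avg = sum(Zk .* P)                  % Average = sum_k Z_k * P(Z_k)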
An Example
Image Variance Based on the pdf

\sigma^2 = \sum_{k=0}^{255} (Z_k - Average)^2 \, P(Z_k)
A MATLAB Code to Compute the Image Variance
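As before, a minimal sketch (not the original slide code) of the pdf-based variance:

% Minimal sketch: image variance from the pdf.
I = imread('image.jpg');            % placeholder file name
R = I(:,:,1);                       % red channel
P = imhist(R, 256) / numel(R);      % pdf
Zk = (0:255)';                      % intensity levels
avg = sum(Zk .* P);                 % pdf-based average
sigma2 = sum((Zk - avg).^2 .* P)    % sigma^2 = sum_k (Z_k - avg)^2 * P(Z_k)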
An Example
Normalized Image Variance Based on the pdf

R = 1 - \frac{1}{1 + \sigma_N^2}, \qquad \sigma_N^2 = \frac{\sigma^2}{(L-1)^2} = \frac{\sigma^2}{255 \times 255}

• It is better than the variance because its value is restricted between 0 and 1.
MATLAB Code to Compute the Normalized Image Variance
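A minimal sketch (not the original slide code), under the same assumptions as the earlier sketches:

% Minimal sketch: normalized image variance R.
I = imread('image.jpg');            % placeholder file name
Rch = I(:,:,1);                     % red channel
P = imhist(Rch, 256) / numel(Rch);  % pdf
Zk = (0:255)';
avg = sum(Zk .* P);
sigma2 = sum((Zk - avg).^2 .* P);   % ordinary variance
sigma2N = sigma2 / 255^2;           % normalize by (L-1)^2 = 255^2
Rnorm = 1 - 1/(1 + sigma2N)         % value restricted between 0 and 1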
An Example
Nth Moments Based on the pdf

\mu_n = \sum_{k=0}^{255} (Z_k - Average)^n \, P(Z_k), \qquad n = 0, 1, 2, 3, 4, \ldots

\mu_n is the nth moment. If n = 2, this leads to the variance.
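No code slide accompanies the moments; a minimal sketch of a general nth central moment, under the same assumptions as the earlier sketches, could look like this:

% Minimal sketch (assumption, not from the slides): nth central moment.
I = imread('image.jpg');            % placeholder file name
R = I(:,:,1);
P = imhist(R, 256) / numel(R);      % pdf
Zk = (0:255)';
avg = sum(Zk .* P);
n = 3;                              % choose the moment order
mu_n = sum((Zk - avg).^n .* P)      % mu_n; n = 2 reproduces the variance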
Image Skewness Based on the pdf

\mu_3 = \sum_{k=0}^{255} (Z_k - Average)^3 \, P(Z_k)

Skewness = \frac{\mu_3}{\sigma^3}

where \sigma is the standard deviation. If the skewness is 0, the pdf is symmetric; if it is positive, the pdf is skewed to the right; and if it is negative, the pdf is skewed to the left.
Image Kurtosis Based on the pdf

\mu_4 = \sum_{k=0}^{255} (Z_k - Average)^4 \, P(Z_k)

Kurtosis = \frac{\mu_4}{\sigma^4}

where \sigma is the standard deviation. Kurtosis measures the peakedness of the distribution.
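The slides do not show code for skewness and kurtosis; the following sketch (an assumption, built on the same pdf as above) computes both from the third and fourth central moments.

% Minimal sketch (assumption, not from the slides): skewness and kurtosis.
I = imread('image.jpg');            % placeholder file name
R = I(:,:,1);
P = imhist(R, 256) / numel(R);      % pdf
Zk = (0:255)';
avg = sum(Zk .* P);
sigma = sqrt(sum((Zk - avg).^2 .* P));   % standard deviation
mu3 = sum((Zk - avg).^3 .* P);           % third central moment
mu4 = sum((Zk - avg).^4 .* P);           % fourth central moment
skewness = mu3 / sigma^3                 % 0 for a symmetric pdf
kurtosis = mu4 / sigma^4                 % peakedness of the pdf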
Image Uniformity (U) Based on the pdf

U(Z) = \sum_{k=0}^{255} P(Z_k)^2
MATLAB Code to Compute the Image Uniformity
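A minimal sketch (not the original slide code), under the same assumptions as before:

% Minimal sketch: image uniformity U.
I = imread('image.jpg');            % placeholder file name
R = I(:,:,1);
P = imhist(R, 256) / numel(R);      % pdf
U = sum(P.^2)                       % U(Z) = sum_k P(Z_k)^2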
An Example
Image Entropy Based on the pdf

Entropy = -\sum_{k=0}^{255} P(Z_k) \log_2 P(Z_k)

• Entropy is a measure of variability. It is zero for a homogeneous image (all the pixels have the same value).
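No code slide is given for entropy; a minimal sketch under the same assumptions follows. Bins with zero probability are skipped so that 0*log2(0) does not produce NaN.

% Minimal sketch (assumption, not from the slides): image entropy.
I = imread('image.jpg');            % placeholder file name
R = I(:,:,1);
P = imhist(R, 256) / numel(R);      % pdf
P = P(P > 0);                       % skip empty bins (0*log2(0) is taken as 0)
H = -sum(P .* log2(P))              % entropy in bits; 0 for a homogeneous image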
Joint Histogram

• It is the frequency of occurrence of two events.
• For example, we can build the joint histogram of the red and green channels of a color image.
• A joint histogram can be viewed as a tool to fuse information from multiple sources.
• A joint histogram is a 2D surface.
• The joint histogram of a source (e.g., the red channel) with itself produces the classical histogram along the diagonal of the 2D histogram.
How to Compute the Joint Histogram
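The slide's computation is shown as an image; the following is a minimal sketch of one straightforward way to build the joint histogram of the red and green channels (an assumption, with a placeholder file name).

% Minimal sketch (not the original slide code): joint histogram of two channels.
I = imread('image.jpg');                      % placeholder file name
Rc = I(:,:,1);  Gc = I(:,:,2);                % red and green channels
H = zeros(256, 256);                          % 2D joint histogram
for p = 1:numel(Rc)
    r = double(Rc(p)) + 1;                    % shift intensities 0..255 to indices 1..256
    g = double(Gc(p)) + 1;
    H(r, g) = H(r, g) + 1;                    % count co-occurrences of (r, g)
end
mesh(H)                                       % view the joint histogram as a 2D surface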
Joint Histogram for Red & Green Channels
Joint Histogram for Red & Blue Channels
Joint Histogram for Red & Red Channel
