IVP notes

The document discusses image restoration techniques, focusing on the degradation model, noise models, and methods for estimating degradation functions. It explains how to recover original images from degraded versions by addressing issues like noise and blur using filters such as Wiener and Inverse filters. Key concepts include the causes of image degradation, modeling techniques, and the importance of working in the frequency domain for effective restoration.


UNIT - 5

Image Restoration: Degradation model, noise models, estimation of degradation function by modeling, restoration using Wiener filters and Inverse filters

Image Restoration is the process of recovering an original, clean image from a degraded
version by reducing or reversing known distortions or degradations (like blur, noise, or motion
artifacts).

The goal is to improve the quality of an image by:


• Removing noise
• Reversing blur
• Restoring lost image details
• Enhancing interpretability or visual appearance

Key Causes of Image Degradation


1. Noise – Random variations in intensity due to sensor defects or environmental conditions.
2. Blur – Caused by motion, defocus, or atmospheric turbulence.
3. Distortion – Due to lens imperfections or imaging system limitations.
4. Missing Data – Block loss in compressed images or corrupted pixels.

Degradation Model:
A degradation model describes how a clean, original image loses quality because of problems such as blur, noise, or other issues introduced while capturing or transmitting it.
It helps us understand:
• Why the image looks bad
• What caused the image to degrade
• How we can fix it using image restoration

Imagine clicking a photo:


• If your hand shakes → the image becomes blurry.
• If it’s too dark → noise (grains) appear.
• If the lens is dirty → the image is unclear.
This whole process where the image goes from good to bad is called image degradation, and we
use a model (formula) to understand it.

Why Do Images Get Degraded? (Causes)


1. Blur – Due to motion or out-of-focus lens.
2. Noise – Due to poor lighting, bad sensor, or transmission errors.
3. Lens issues – Imperfect or dirty camera lens.
4. Environmental conditions – Fog, dust, or vibrations.

Degradation Function
Degradation Function represents the process that distorts or blurs the original image.
It shows how each pixel of the original image spreads due to:
1. Camera motion,
2. Out-of-focus lens,
3. Atmospheric disturbance, etc.

In the spatial domain, the degradation model is written as:

g(x, y) = h(x, y) * f(x, y) + η(x, y)

where * denotes convolution. Taking the Fourier transform of both sides gives the frequency-domain form:

G(u, v) = H(u, v) F(u, v) + N(u, v)

• G(u,v) → The degraded image (blurred + noisy), in frequency form.
• H(u,v) → The degradation function (how the image got blurred), in frequency form. It is the Fourier transform of the PSF (Point Spread Function) h(x, y).
• F(u,v) → The original good image, in frequency form.
• N(u,v) → The noise added during image capture or transmission, in frequency form.
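A tiny NumPy sketch of this model, with made-up values (the 7-pixel horizontal PSF and the noise level are illustrative assumptions, not values from these notes), showing a degraded image being simulated as G = H·F + N:

```python
import numpy as np

rng = np.random.default_rng(0)
f = rng.random((64, 64))                   # stand-in for the original image f(x, y)
h = np.zeros((64, 64))
h[0, :7] = 1.0 / 7.0                       # a simple 7-pixel horizontal motion-blur PSF h(x, y)

F = np.fft.fft2(f)                         # F(u, v)
H = np.fft.fft2(h)                         # H(u, v)
N = np.fft.fft2(rng.normal(0.0, 0.01, f.shape))   # N(u, v), additive noise spectrum

G = H * F + N                              # degraded spectrum G(u, v)
g = np.real(np.fft.ifft2(G))               # degraded (blurred + noisy) image g(x, y)
```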

Why Estimate the Degradation Function?


To restore an image, we must know or estimate the degradation function h(x, y) or its frequency version H(u, v).
In many real-world cases, the degradation function is not directly available, so we model it based
on:
• The appearance of the blurred image
• Assumed cause of the blur (e.g., motion, defocus, turbulence)

What is Modeling?
Modeling is the process of:
• Assuming a type of blur (like motion or defocus)
• Using mathematical functions to represent the blur
• Estimating parameters (like length, angle, spread)

Here are some common types:


1. Motion Blur Model
This happens when either the camera or the object moves while the picture is being taken.
Imagine taking a photo of a fast-moving car or shaking the camera by mistake — the image
looks stretched or smeared in one direction.
How is it modeled?
• In simple terms, the blur spreads out along the path of the movement.
• Think of it like dragging a paintbrush in a straight line, which makes a streak.
• The length of this streak is called L, and the angle (direction) in which the camera/object moved is called a.

In the frequency domain (how computers analyze images), this blur produces a sinc-like ripple pattern whose shape depends on the length and direction of the motion.
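A minimal sketch, assuming NumPy, of how such a motion-blur PSF could be built; the function name and the kernel size are hypothetical choices for illustration:

```python
import numpy as np

def motion_blur_psf(length, angle_deg, size=15):
    """Linear motion-blur PSF: a normalized streak of 'length' pixels
    drawn at angle 'angle_deg' through the center of a size x size kernel."""
    psf = np.zeros((size, size))
    center = size // 2
    angle = np.deg2rad(angle_deg)
    for t in np.linspace(-length / 2.0, length / 2.0, num=4 * length):
        x = int(round(center + t * np.cos(angle)))
        y = int(round(center - t * np.sin(angle)))
        if 0 <= x < size and 0 <= y < size:
            psf[y, x] = 1.0
    return psf / psf.sum()        # normalize so the blur preserves overall brightness

psf = motion_blur_psf(length=9, angle_deg=0)   # a horizontal smear roughly 9 pixels long
```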

2. Out-of-Focus Blur Model


This happens when the camera lens is not properly focused on the subject. If you try to take a
photo without adjusting focus correctly, the picture looks fuzzy and unclear.

How is it modeled?
• The blur looks like a soft circular patch around each point in the image.
• Imagine shining a flashlight through a round glass; the light spreads out evenly in a
circle.
• Inside the circle, the brightness is uniform, and outside it, it’s zero.

Result in image:
The photo looks blurry in all directions equally — like a soft, round haze.
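A similar sketch (NumPy, hypothetical helper name) of the uniform circular "pillbox" PSF described above:

```python
import numpy as np

def defocus_psf(radius, size=15):
    """Out-of-focus (pillbox) PSF: uniform intensity inside a circle of
    'radius' pixels, zero outside, normalized to sum to 1."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    psf = (x**2 + y**2 <= radius**2).astype(float)
    return psf / psf.sum()

psf = defocus_psf(radius=4)
```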
3. Gaussian Blur Model
This blur happens due to environmental factors like atmospheric conditions (fog, heat waves) or
the camera sensor’s limits. It's like looking through a foggy window.
How is it modeled?
• The blur follows a smooth, bell-shaped curve (Gaussian function).
• Points near the center remain bright, and points farther away gradually fade out.
• The spread of the blur depends on a parameter a (the standard deviation of the Gaussian) that controls how fuzzy the image gets.

Result in image:
The photo looks softly blurred — edges are less sharp, and the image looks smooth and fuzzy.
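A corresponding sketch of the Gaussian PSF (NumPy; the helper name is hypothetical, and sigma plays the role of the spread parameter mentioned above):

```python
import numpy as np

def gaussian_psf(sigma, size=15):
    """Gaussian PSF: bell-shaped weights that fade smoothly with distance
    from the center; 'sigma' controls how wide (fuzzy) the blur is."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    psf = np.exp(-(x**2 + y**2) / (2.0 * sigma**2))
    return psf / psf.sum()

psf = gaussian_psf(sigma=2.0)
```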

How Do Experts Estimate What Kind of Blur Happened?


When trying to fix a blurry image, experts usually follow these steps:
1. Observe the image carefully:
Look at the blurred image and decide what kind of blur it might be.
2. Guess the cause:
Is it because of motion? Or out-of-focus lens? Or atmospheric issues?
3. Choose the right model:
Pick the mathematical model that best fits the suspected blur.
4. Estimate parameters:
Measure or calculate details like blur length, blur angle, or blur radius.
5. Create the Point Spread Function (PSF):
The PSF is a small image that shows how a single point of light spreads in the blur.
6. Restore the image:


Use filters (like Wiener or Inverse filters) that use the PSF to reverse the blur and get a
clearer image.

Example: Motion Blur Estimation


• You see an image that looks smeared horizontally.
• You assume it’s motion blur caused by horizontal movement.
• So you pick a motion blur model where the blur is stretched sideways.
• You estimate how far (length) and at what angle the blur moved.
• Then you use this model in a filter that tries to “undo” the motion blur and restore the
sharp image.

Why Frequency Domain?


• Images are normally stored in spatial form (x, y).
• But for restoration, algorithms work better in the frequency domain (u, v) because:
o Blurring becomes multiplication (easy to reverse).
o Noise patterns are easier to detect and filter.
o Fast mathematical tools like Fourier Transform help in this conversion.

What is PSF (h(x, y))?


• PSF stands for Point Spread Function.
• It tells how a single point in the image spreads (blurs) due to degradation.
• For example, if a point spreads into a small circle → it’s blurred.
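A quick illustration of this idea, reusing the hypothetical gaussian_psf helper from the sketch above and assuming SciPy is available: convolving an image that contains a single bright pixel with a PSF shows exactly how that point spreads.

```python
import numpy as np
from scipy.signal import convolve2d

point = np.zeros((15, 15))
point[7, 7] = 1.0                               # an image containing a single point of light

psf = gaussian_psf(sigma=2.0)                   # PSF from the Gaussian-blur sketch above
spread = convolve2d(point, psf, mode='same')    # the point spreads into a blob shaped like the PSF
```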

Real-Life Example
Let’s say you took a picture of a moving car at night:
• Original image (f): Clear car
• Blur (h): Because the car moved
• Noise (η): Because it was dark
• Final image (g): Blurry and grainy photo
This is what the degradation model explains.

It is used in:
o Medical images (MRI/CT scan)
o Satellite images
o Old photo repair
o CCTV footage improvement

Problems in Real Use


• Sometimes we don’t know what caused the degradation (blur or noise).
• Noise can be random and hard to guess.
• We need smart tools to guess the blur and remove noise correctly.

Restoration using Wiener filters and Inverse filters


1. Inverse Filtering (Deterministic Approach)
To reverse the degradation process (like blurring) by applying the inverse of the degradation
function, assuming no noise or negligible noise.

Mathematical Foundation:
Ignoring noise, the degradation model reduces to G(u, v) = H(u, v) F(u, v), so the original spectrum can be recovered by simple division:

F̂(u, v) = G(u, v) / H(u, v)

The restored image f̂(x, y) is the inverse Fourier transform of F̂(u, v).

Assumptions:
• The degradation function H(u,v) is known and invertible.
• Noise is not present or is very small.
• Degradation is linear and spatially invariant.

Working Steps:
1. Input the degraded image g(x, y).
2. Compute its Fourier transform G(u, v).
3. Divide by the known degradation function: F̂(u, v) = G(u, v) / H(u, v).
4. Take the inverse Fourier transform of F̂(u, v) to obtain the restored image f̂(x, y).
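A minimal NumPy sketch of these steps, assuming the PSF h is known. The small epsilon clamp is an illustrative safeguard against dividing by near-zero values of H(u,v), not part of the textbook formula:

```python
import numpy as np

def inverse_filter(g, h, eps=1e-3):
    """Inverse filtering: divide the degraded spectrum G(u,v) by the
    degradation function H(u,v), then transform back to the spatial domain."""
    G = np.fft.fft2(g)
    # PSF zero-padded to the image size; in practice the PSF is usually
    # shifted so its center sits at the origin (e.g. with np.fft.ifftshift)
    # to avoid a translated result.
    H = np.fft.fft2(h, s=g.shape)
    H = np.where(np.abs(H) < eps, eps, H)   # clamp tiny values of H to avoid division by ~0
    return np.real(np.fft.ifft2(G / H))
```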

Problem with Inverse Filter:

• Noise and errors get amplified, especially at frequencies where H(u,v) is close to zero (dividing by a tiny value blows the noise up)


• Results in artifacts and distortion

Example Use Case:


• De-blurring a scanned document with known motion blur.
• Recovering an image blurred by a known lens function.

2. Wiener Filtering (Statistical Approach)


To minimize the mean square error (MSE) between the original image and the restored image,
considering both blur and noise.

Mathematical Foundation:

Degradation model with noise:

G(u, v) = H(u, v) F(u, v) + N(u, v)

The Wiener filter minimizes the mean square error and estimates the original spectrum as:

F̂(u, v) = [ H*(u, v) / ( |H(u, v)|^2 + Sη(u, v)/Sf(u, v) ) ] G(u, v)

where H*(u, v) is the complex conjugate of H(u, v), and Sη and Sf are the power spectra of the noise and the original image. In practice the ratio Sη/Sf is often approximated by a constant K.

Assumptions:
• The degradation function H(u,v) is known.
• Noise is additive, Gaussian, and statistically independent of the image.
• Power spectra of noise and image are known or approximated.

Working Steps:
1. Input the degraded image g(x, y) and compute its Fourier transform G(u, v).
2. Obtain or estimate the degradation function H(u, v) and the noise-to-signal ratio K = Sη(u, v)/Sf(u, v).
3. Multiply G(u, v) by the Wiener filter H*(u, v) / (|H(u, v)|^2 + K).
4. Take the inverse Fourier transform to obtain the restored image f̂(x, y).
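A matching NumPy sketch, assuming the PSF is known and approximating the noise-to-signal power ratio by a hand-tuned constant K:

```python
import numpy as np

def wiener_filter(g, h, K=0.01):
    """Wiener filtering with a constant noise-to-signal ratio K: frequencies
    where noise dominates are attenuated instead of being divided blindly by H."""
    G = np.fft.fft2(g)
    H = np.fft.fft2(h, s=g.shape)               # PSF zero-padded to the image size
    W = np.conj(H) / (np.abs(H) ** 2 + K)       # Wiener filter in the frequency domain
    return np.real(np.fft.ifft2(W * G))
```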

Advantages:
• Considers noise along with blur.
• Produces better results than inverse filter in real-life applications.


• Reduces mean square error between estimated and original image.

Disadvantages:
• Requires estimation of noise and signal power spectra.
• May be complex to compute precisely if noise characteristics are unknown.

Example Use Case:


• Restoring astronomical images degraded by atmospheric blur and sensor noise.
• De-noising and de-blurring medical images like MRI or CT scans.

Comparison Table

Feature | Inverse Filter | Wiener Filter
Handles noise | No | Yes
Assumes degradation model | Known | Known
Additional info needed | None | Noise-to-signal power ratio or PSDs
Robust to small H(u,v) | No – amplifies noise | Yes – controls amplification using regularization
Accuracy in real-world use | Poor if noise exists | High
Computational cost | Low | Medium (requires more estimation)
Based on | Deterministic division | Statistical estimation (MSE minimization)

Noise in Images: Noise in images is any unwanted random variation of brightness or color
that makes the image look grainy, speckled, or distorted. It can come from:
• Faulty camera sensors
• Low light conditions
• Transmission errors
• Electronic interference

Common Types of Noise Models


1. Gaussian Noise
• Comes from natural sources like thermal noise in electronics.
• Looks like small variations in brightness spread all over the image

2. Salt and Pepper Noise


• Appears as random black and white dots in the image.
• Caused by sudden disturbances like transmission errors or faulty memory.

3. Poisson Noise (Shot Noise)


• Occurs in low-light photography due to the randomness of photon detection.
• Intensity of noise depends on signal (brighter areas have more noise).

4. Speckle Noise
• Mostly affects radar, ultrasound, or medical images.
• Looks like a grainy texture; it is multiplicative noise (the noise level depends on the signal).
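A short NumPy sketch (parameter values are illustrative) of how each of these noise models could be simulated on a toy image:

```python
import numpy as np

rng = np.random.default_rng(0)
img = rng.random((128, 128))                      # stand-in clean image with values in [0, 1]

# Gaussian (additive) noise
gaussian = img + rng.normal(0.0, 0.05, img.shape)

# Salt & pepper (impulse) noise: random pixels forced to 0 or 1
salt_pepper = img.copy()
mask = rng.random(img.shape)
salt_pepper[mask < 0.02] = 0.0                    # pepper
salt_pepper[mask > 0.98] = 1.0                    # salt

# Poisson (shot) noise: the randomness scales with the signal itself
poisson = rng.poisson(img * 255.0) / 255.0

# Speckle (multiplicative) noise
speckle = img * (1.0 + rng.normal(0.0, 0.1, img.shape))
```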

To Remove Noise


You can use filters like:
• Gaussian filter: smooths Gaussian noise
• Median filter: removes salt and pepper noise
• Wiener filter: works well for Gaussian & Poisson noise
• Anisotropic diffusion or bilateral filters: preserve edges while reducing noise.
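A brief sketch of two of these filters applied to the noisy arrays from the previous sketch, assuming SciPy is available:

```python
from scipy.ndimage import median_filter, gaussian_filter

# 'salt_pepper' and 'gaussian' are the noisy images from the sketch above.
# Median filtering suits salt & pepper noise: the neighborhood median
# simply ignores the extreme 0/1 outliers.
denoised_sp = median_filter(salt_pepper, size=3)

# Gaussian smoothing reduces additive Gaussian noise (at the cost of some sharpness).
denoised_gauss = gaussian_filter(gaussian, sigma=1.0)
```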

Summary Table:
Noise Type | Appearance | Cause | Model Type
Gaussian | Soft grainy noise | Sensor/electronic noise | Additive
Salt & Pepper | Black & white dots | Data loss or bit errors | Impulse
Poisson | Photon randomness | Low light conditions | Signal-dependent
Speckle | Grainy texture | Coherent imaging systems | Multiplicative
