
Detection and Estimation Theory
Class Notes
EE 768

EE 768: Detection & Estimation Theory
Tentative Course Outline (subject to change)

Introduction: Structure of detection & estimation problems.

Simple Binary Hypothesis Testing: Decision Criteria; Maximum Likelihood, Neyman-Pearson, Bayes risk, Probability of Error and Min-Max Criteria; Receiver Operating Characteristics; Single and Multiple Observations cases; Composite Hypothesis Testing.

M-ary Hypotheses or Multiple Decision Theory: Multiple decisions; Bayes risk; Probability of Error; General and Gaussian cases.

Detection of Continuous Time Signals: Detection of Signals in Additive White Gaussian Noise; Detection in Non-white Gaussian Noise; Signals with Unwanted Parameters (Random phase, Random amplitude and phase).

Estimation Theory Fundamentals: Maximum Likelihood, Bayes Cost and Minimum Variance Estimators; Relationship of Estimators.

Estimation with Gaussian Noise: Linear Observations; Sequential Observations; Non-linear Estimation; E-M algorithm; State Estimation and Kalman Filtering.

Properties of Estimators: Unbiased Estimators; Efficient Estimators; Asymptotic Properties.

Main Text: Detection, Estimation & Modulation Theory, Part I: Harry L. Van Trees, John Wiley; Paperback reprint, 2003.

January 2012, Detection & Estimation, SP, IIT Delhi


Learning Outcomes

Be able to formulate different kinds of Signal Detection Problems.
Be able to use statistical and mathematical tools to solve a hierarchy of Detection Problems and design optimum processors.
Be able to formulate different kinds of parameter and signal estimation problems.
Be able to solve basic problems using statistical and mathematical tools and design optimal estimators.


Introduction: Detection Theory

Deals with detecting the presence of useful signals in the presence of noise and interference.

Applications: Digital Communications, Radar, Sonar, Radio-astronomy; many others.

Broadly, we may look for:
Known signals,
Partially known signals, or
Completely unknown and random signals,
in noise.


Examples
Known Signals in Noise Problems
Digital Communication via FSK:

s1(t) = sin ω1t,  s0(t) = sin ω0t,  0 ≤ t ≤ T

Transmitted over a communication channel, which yields an undistorted replica, with thermal noise:

r(t) = s1(t) + n(t), or
r(t) = s0(t) + n(t),  0 ≤ t ≤ T

Receiver: RF amplifier + Processor to decide which of the two signals was transmitted over the time interval.

Detection Theory Problem: Design and evaluate the Processor.
Signals with Unknown Parameters in Noise

FSK problem with a random phase drift in the oscillator or medium:

r(t) = sin(ω1t + θ1) + n(t),  0 ≤ t ≤ T
r(t) = sin(ω0t + θ0) + n(t),  0 ≤ t ≤ T

θ1 and θ0: unknown, but constant phase angles.

Signal not completely known even in the absence of noise. This uncertainty must be taken into account while designing the detector.


Signals with Unknown Parameters in Noise
Active Radar/Sonar Problem:

Tx. Signal: s(t) = sin ωct,  0 ≤ t ≤ T

Received signal when target present:

r(t) = Vr sin(ωc(t − τ) + θr) + n(t),  τ ≤ t ≤ τ + T
     = n(t),  0 ≤ t < τ and τ + T < t ≤ ∞.

When target absent:

r(t) = n(t),  0 ≤ t ≤ ∞

Three unknown parameters (Vr, τ, θr) even in the absence of noise.


Random Signals in Noise

Passive sonar detection: Receiver listens for noise generated by enemy submarines: engines, propellers and other machinery.

Acoustic signals travel through the ocean to hydrophones in the detection system.

Desired signal: itself a sample function of a random process.

Hydrophone generates self-noise and picks up sea noise.


Random Signals in Noise

Passive Sonar Detection Problem:

r(t) = s(t) + n(t) : Signal present
r(t) = n(t) : Signal absent


Random Signals in Noise

Troposcatter / Underwater Acoustic / Fading Channel Communication Problem:
Transmitted signals: s1(t) and s0(t), as in FSK.
Received signals:

r(t) = s1(t) + n(t) : when s1(t) transmitted
r(t) = s0(t) + n(t) : when s0(t) transmitted

s1(t) and s0(t): sample functions of random processes centred around ω1 and ω0 respectively.


Random Signals in Noise

In such cases, the signals to be detected lack a deterministic component.

Detector Design: Must be based on the difference in the statistical properties of the random processes of which s1(t) and s0(t) are sample functions.


Multiple Decision Problems

M-ary digital modulations.
Classification problems from observed feature data, etc.


Hierarchy: 3 levels

Known signals in noise: Synchronous digital communications; Pattern recognition problems.

Signals with unknown parameters: Active radar and sonar; target classification; Digital communication systems without phase reference or over slowly fading channels.

Random signals in noise: Digital comm. over scatter links; Passive sonar detection; Seismic detection systems; radio-astronomy.


Classical Detection Theory
Definitions

Components of Decision Theory Problems:

(Block diagram) Source (H1 or H0) → Probabilistic Transition Mechanism → Observation space → Decision


Components of Decision Theory Problems

Source: generates an output: one of two (or M) choices.

Digital Communications: 1s and 0s of information to be transmitted.
Medical ECG: H1: Patient had a heart attack; H0: absent.
Speaker Classification: Speaker German, British or American; Male or Female (6 hypotheses).
Radar/Sonar: H1: Target present; H0: Target absent, in a given range-azimuth cell.


Components of Decision Theory Problems

Probabilistic Transition Mechanism: a device that knows the source output, but modifies it to generate a point in a suitable observation space according to a probability law.

(Diagram) H1 → +1, H0 → −1; noise n is added to give r. Observation space: one-dimensional.


Example 1.1

H1: r = 1 + n;  H0: r = −1 + n

Probability densities of r on the two hypotheses:
(Figure: the conditional densities p(r|H1) and p(r|H0).)


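The decision rule of Example 1.1 can be sketched in a few lines of code. A minimal sketch, assuming (it is not stated explicitly on this slide) that n is Gaussian with zero mean and variance σ²; the function names are illustrative:

```python
import math

# Example 1.1 decision rule, sketched numerically.  Assumption (not
# stated explicitly on the slide): n ~ N(0, sigma^2), so that
# H1: r = 1 + n  and  H0: r = -1 + n  give Gaussian densities for r.

def likelihood_ratio(r, sigma=1.0):
    """Lambda(r) = p(r|H1) / p(r|H0); the normalising constants cancel."""
    p1 = math.exp(-(r - 1.0) ** 2 / (2.0 * sigma ** 2))
    p0 = math.exp(-(r + 1.0) ** 2 / (2.0 * sigma ** 2))
    return p1 / p0

def decide(r, eta=1.0, sigma=1.0):
    """Say H1 when Lambda(r) exceeds the threshold eta, else H0."""
    return "H1" if likelihood_ratio(r, sigma) > eta else "H0"
```

For η = 1 this reduces to deciding by the sign of r, since Λ(r) = exp(2r/σ²) here.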


Example 1.2

Source generates two numbers in a sequence.

H1: r1 = 1 + n1;  r2 = 1 + n2
H0: r1 = −1 + n1;  r2 = −1 + n2

Observation Space: 2-dimensional.


Decision Rule

Decision Rule: guesses which hypothesis is true: assigns each point of the observation space to one of the hypotheses.

The Decision Rule mapping is the subject of Detection Theory.


Simple Binary Hypothesis Tests

Let the observation space correspond to N observations: (r1, r2, ..., rN).
Each observation: a point in an N-dimensional space, denoted by R.
R is generated either under H0 or H1, with a-priori probabilities P0 and P1, and known conditional densities p_{r|H0}(R|H0) and p_{r|H1}(R|H1) respectively.

Use this information to develop a suitable decision rule.
Decision Criteria

(i) Bayes Criterion: Assumes knowledge of P0 and P1.
Each of the 4 decisions is assigned a cost: C00, C10, C11, C01.
The average cost or risk ℛ is defined as

ℛ = C00 P0 Pr(say H0 | H0 true)
  + C10 P0 Pr(say H1 | H0 true)
  + C11 P1 Pr(say H1 | H1 true)
  + C01 P1 Pr(say H0 | H1 true)


Decision Rule: Bayes Criterion

The Decision Rule must say either H1 or H0, based on the observed vector.
Rule: divide the total observation space Z into two parts, Z0 and Z1.


Decision Rule: Bayes Criterion

ℛ = C00 P0 ∫_{Z0} p_{r|H0}(R|H0) dR
  + C10 P0 ∫_{Z1} p_{r|H0}(R|H0) dR
  + C11 P1 ∫_{Z1} p_{r|H1}(R|H1) dR
  + C01 P1 ∫_{Z0} p_{r|H1}(R|H1) dR


Decision Rule: Bayes Test

It is natural to assume that the cost of a wrong decision is larger than that of the corresponding correct decision: i.e., C10 > C00 and C01 > C11.

Bayes Test: Choose the decision regions Z0 and Z1 such that the average risk ℛ is minimized.


Decision Rule: Bayes Test

Now Z = Z0 ∪ Z1 (a disjoint union), so Z1 = Z − Z0.
Hence

ℛ = C00 P0 ∫_{Z0} p_{r|H0}(R|H0) dR + C10 P0 ∫_{Z−Z0} p_{r|H0}(R|H0) dR
  + C11 P1 ∫_{Z−Z0} p_{r|H1}(R|H1) dR + C01 P1 ∫_{Z0} p_{r|H1}(R|H1) dR


Bayes Test

Can be written as follows:

ℛ = P0 C10 + P1 C11
  + ∫_{Z0} { [P1 (C01 − C11) p_{r|H1}(R|H1)] − [P0 (C10 − C00) p_{r|H0}(R|H0)] } dR

The first two terms: fixed cost. The integral: cost controlled by those points R assigned to Z0.


Bayes Test

The two terms inside the brackets are positive.
For R such that the 2nd term > the 1st term: assign to Z0, since such points contribute a negative amount to the integral.
Similarly, for R such that the 1st term > the 2nd term: exclude from Z0 (i.e., assign to Z1).
Assign R arbitrarily when the two terms are equal.

Region Partitioning Equation:
If P1 (C01 − C11) p_{r|H1}(R|H1) ≥ P0 (C10 − C00) p_{r|H0}(R|H0): assign R to Z1 (say H1 is true);
else assign R to Z0 (say H0 is true).


Bayesian Test: Likelihood Ratio Test

Alternatively:

p_{r|H1}(R|H1) / p_{r|H0}(R|H0) ≥ P0 (C10 − C00) / [P1 (C01 − C11)] : say H1; else H0.

Likelihood Ratio: Λ(R) = p_{r|H1}(R|H1) / p_{r|H0}(R|H0)
Λ(R) is a one-dimensional random variable, independent of the dimension of R.

Likelihood Ratio Test:  Λ(R) > η : H1;  Λ(R) < η : H0

Threshold: η = P0 (C10 − C00) / [P1 (C01 − C11)]
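The threshold expression is easy to check numerically. A minimal sketch; the function name is illustrative and the cost/prior values in the usage note are arbitrary examples, not taken from the notes:

```python
# Bayes likelihood-ratio threshold: eta = P0 (C10 - C00) / [P1 (C01 - C11)],
# with P1 = 1 - P0.  Requires C10 > C00 and C01 > C11 (wrong decisions
# cost more than correct ones, as assumed on the Bayes Test slide).

def bayes_threshold(P0, C00, C10, C11, C01):
    """Threshold eta of the likelihood ratio test."""
    P1 = 1.0 - P0
    return (P0 * (C10 - C00)) / (P1 * (C01 - C11))
```

With equal priors and the error costs of the minimum-Pe case (C00 = C11 = 0, C01 = C10 = 1), this gives η = 1; raising P0 to 0.8 raises η to 4, making the test more reluctant to declare H1.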


Remarks

All data processing/operations are involved in computing Λ(R).
Λ(R) is not affected by a-priori probabilities or cost assignments: important and nice, since in many cases these values are at best an educated guess.
This allows building a processor independent of these parameters.
η is left as a variable parameter, to accommodate changes in our estimates of the a-priori probabilities and costs.

An equivalent test:
ln Λ(R) > ln η : H1;  ln Λ(R) < ln η : H0


Likelihood Ratio Processors
(Figure: block diagrams of likelihood ratio processors.)


Example 1

H1: ri = m + ni,  i = 1, 2, ..., N
H0: ri = ni,  i = 1, 2, ..., N

ni: statistically independent, with density
p_{ni}(X) = (1/(√(2π) σ)) exp(−X²/(2σ²))

Then we have
p_{ri|H1}(Ri|H1) = p_{ni}(Ri − m) = (1/(√(2π) σ)) exp(−(Ri − m)²/(2σ²))
and
p_{ri|H0}(Ri|H0) = p_{ni}(Ri) = (1/(√(2π) σ)) exp(−Ri²/(2σ²))
Example 1: Model
(Figure: observation model for Example 1.)


Example 1: Contd.

Since the ni are statistically independent,

p_{r|H1}(R|H1) = ∏_{i=1}^{N} (1/(√(2π) σ)) exp(−(Ri − m)²/(2σ²))

p_{r|H0}(R|H0) = ∏_{i=1}^{N} (1/(√(2π) σ)) exp(−Ri²/(2σ²))

Hence

Λ(R) = [∏_{i=1}^{N} (1/(√(2π) σ)) exp(−(Ri − m)²/(2σ²))] / [∏_{i=1}^{N} (1/(√(2π) σ)) exp(−Ri²/(2σ²))]


Example 1: Likelihood Ratio Test

After cancelling common terms and taking the log:

ln Λ(R) = (m/σ²) ∑_{i=1}^{N} Ri − N m²/(2σ²)

Log-likelihood ratio test:

(m/σ²) ∑_{i=1}^{N} Ri − N m²/(2σ²) ≥ ln η : H1 true; else H0 true.


Example 1: Sufficient Statistic

Or equivalently (for m > 0):

∑_{i=1}^{N} Ri ≥ (σ²/m) ln η + N m/2 = γ

Sufficient statistic: l(R) = ∑_{i=1}^{N} Ri.

Knowing the value of l(R) is just as good as knowing R, as far as decision making is concerned.
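The Example 1 test reduces to comparing the sufficient statistic against a single scalar, which makes it short to implement. A minimal sketch, assuming m > 0 as above; the parameter values used in any test run are made up for illustration:

```python
import math

# Example 1 detector via its sufficient statistic:
# l(R) = sum_i R_i  compared with  gamma = (sigma^2/m) ln(eta) + N m / 2.
# Assumes m > 0 (dividing the log-LRT by m/sigma^2 without flipping
# the inequality).

def detect_mean_shift(R, m, sigma, eta=1.0):
    """Return 'H1' if l(R) >= gamma, else 'H0' (Gaussian mean-shift test)."""
    N = len(R)
    l = sum(R)                                   # sufficient statistic l(R)
    gamma = (sigma ** 2 / m) * math.log(eta) + N * m / 2.0
    return "H1" if l >= gamma else "H0"
```

With η = 1, m = 1, σ = 1 and N = 4, the threshold is γ = Nm/2 = 2: observations summing above 2 are declared H1.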



Example 2

Hi: (r1, r2, ..., rN) statistically independent zero-mean Gaussian rvs with variance σi².

p_{r|H1}(R|H1) = ∏_{i=1}^{N} (1/(√(2π) σ1)) exp(−Ri²/(2σ1²))

p_{r|H0}(R|H0) = ∏_{i=1}^{N} (1/(√(2π) σ0)) exp(−Ri²/(2σ0²))

The likelihood ratio test becomes

(1/2)(1/σ0² − 1/σ1²) ∑_{i=1}^{N} Ri² + N ln(σ0/σ1) ≥ ln η : declare H1; else H0.


Example 2: Sufficient Statistic

Here l(R) = ∑_{i=1}^{N} Ri².

An equivalent test (when σ1² > σ0²) is

l(R) ≥ (2σ0²σ1²/(σ1² − σ0²)) [ln η − N ln(σ0/σ1)] = γ : declare H1; else H0.

Example 3 from Van Trees: Self-Reading Exercise: pp. 43-44.
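The Example 2 test can be sketched the same way as Example 1. A minimal sketch, valid only for σ1² > σ0² as stated above; the numeric values in any test run are made up for illustration:

```python
import math

# Example 2 detector via l(R) = sum_i R_i^2, compared with
# gamma = (2 sigma0^2 sigma1^2 / (sigma1^2 - sigma0^2)) * [ln(eta) - N ln(sigma0/sigma1)].
# This form holds only when sigma1 > sigma0 (otherwise the inequality flips).

def detect_variance(R, sigma0, sigma1, eta=1.0):
    """Return 'H1' if l(R) >= gamma, else 'H0' (variance-discrimination test)."""
    if sigma1 <= sigma0:
        raise ValueError("this form of the test assumes sigma1 > sigma0")
    N = len(R)
    l = sum(r ** 2 for r in R)                   # sufficient statistic l(R)
    s0, s1 = sigma0 ** 2, sigma1 ** 2
    gamma = (2.0 * s0 * s1 / (s1 - s0)) * (math.log(eta) - N * math.log(sigma0 / sigma1))
    return "H1" if l >= gamma else "H0"
```

For σ0 = 1, σ1 = 2, N = 1 and η = 1, γ = (8/3) ln 2 ≈ 1.85, so a single large-magnitude observation is declared H1.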



Bayes Tests: Some Special Cases

Let C00 = C11 = 0, C01 = C10 = 1. Then

ℛ = P0 ∫_{Z1} p_{r|H0}(R|H0) dR + P1 ∫_{Z0} p_{r|H1}(R|H1) dR
  = Total probability of making an error = Pe

Thus, in this case the Bayes Test is equivalent to minimization of Pe.
The test becomes:

ln Λ(R) ≥ ln η = ln(P0/P1) = ln P0 − ln(1 − P0) = 0 when P0 = P1 : for H1; else H0.

Usually the case in digital communications: Min. Pe Receivers.
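For equal priors the log-threshold is 0, and the resulting minimum Pe can be computed in closed form for the ±1 Gaussian signalling of Example 1.1. A sketch; the Gaussian-noise assumption and the link back to Example 1.1 are illustrative, not stated on this slide:

```python
import math

# Minimum-Pe special case, specialised to Example 1.1 (+/-1 signals,
# equal priors): ln(eta) = 0, the decision boundary is r = 0, and an
# error occurs when noise pushes r across it, so Pe = Q(1/sigma).
# Assumption (illustrative): n ~ N(0, sigma^2).

def Q(x):
    """Gaussian tail probability Q(x) = P(N(0,1) > x)."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def min_pe(sigma):
    """Minimum error probability for equiprobable +/-1 signals in N(0, sigma^2)."""
    return Q(1.0 / sigma)
```

For σ = 1 this gives Pe = Q(1) ≈ 0.159; Pe decreases monotonically as σ shrinks.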


Assignment #1 and #2

Recommended for self-practice: Problems 2.2.1 to 2.2.12.
Mandatory Submission: Problem Nos.
2.2.1, 2.2.3, 2.2.6, 2.2.8, 2.2.10, 2.2.11,
2.2.13, 2.2.14, 2.2.16, 2.2.18, 2.2.19, 2.2.21,
2.3.1, 2.3.2, 2.3.4, 2.3.6, 2.3.7

Submission Date: Row 1: After 1 week: January 23, 2017. Rows 2 & 3: After 2 weeks: January 30, 2017.

Students are encouraged to try and solve the problems independently. There is no bar on discussions, but the submitted solution should be your own.

Learning Objective: Be able to formulate likelihood ratios and calculate ROCs for a variety of problems.
