Module 2 - 2

The document discusses Gaussian random variables, including their probability density function (PDF), properties of jointly Gaussian random variables, and transformations of multiple random variables. It also covers complex random variables and the central limit theorem, emphasizing the significance of Gaussian distributions in modeling various physical phenomena. Key concepts include independence, covariance, and the effects of linear transformations on Gaussian variables.

Gaussian Random Variable

 The Gaussian rv is easy to use, and many physical phenomena can be modeled as Gaussian rvs, e.g., noise.
 The PDF of a Gaussian rv can be written as

f_X(x) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left( -\frac{(x - \bar{X})^2}{2\sigma^2} \right)

 The Gaussian PDF has two parameters, \bar{X} and \sigma^2; it is centered about x = \bar{X} and has width proportional to the variance \sigma^2.
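As a quick numerical sanity check, the PDF above can be evaluated on a grid and integrated; the parameter values below are illustrative choices of ours, not from the text:

```python
import numpy as np

# Gaussian PDF with hypothetical parameters X_bar = 2, sigma^2 = 4
X_bar, sigma2 = 2.0, 4.0
x = np.linspace(-20.0, 20.0, 200_001)
pdf = np.exp(-(x - X_bar) ** 2 / (2 * sigma2)) / np.sqrt(2 * np.pi * sigma2)

dx = x[1] - x[0]
total = (pdf * dx).sum()      # should be ~1: a valid density
peak_at = x[np.argmax(pdf)]   # should sit at x = X_bar
print(total, peak_at)
```

The density integrates to one and peaks at the mean, as the two-parameter description above implies.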
Jointly Gaussian Random Variable
 Two random variables X and Y are jointly Gaussian if their joint density function is

f_{XY}(x,y) = \frac{1}{2\pi\sigma_X\sigma_Y\sqrt{1-\rho^2}} \exp\left\{ -\frac{1}{2(1-\rho^2)} \left[ \frac{(x-\bar{X})^2}{\sigma_X^2} - \frac{2\rho(x-\bar{X})(y-\bar{Y})}{\sigma_X\sigma_Y} + \frac{(y-\bar{Y})^2}{\sigma_Y^2} \right] \right\}

where

E[X] = \bar{X}, \quad E[(X-\bar{X})^2] = \sigma_X^2
E[Y] = \bar{Y}, \quad E[(Y-\bar{Y})^2] = \sigma_Y^2
E[(X-\bar{X})(Y-\bar{Y})] = \rho\sigma_X\sigma_Y

 The peak is at (\bar{X}, \bar{Y}), and the maximum value is obtained from

f_{X,Y}(x,y) \le f_{X,Y}(\bar{X},\bar{Y}) = \frac{1}{2\pi\sigma_X\sigma_Y\sqrt{1-\rho^2}}

 The locus of constant density is an ellipse.
 \rho = 0 \Rightarrow X and Y are uncorrelated; then f_{X,Y}(x,y) = f_X(x)\,f_Y(y), i.e., they are statistically independent.
The marginal densities follow by integrating out the other variable:

f_X(x) = \int_{-\infty}^{\infty} f_{X,Y}(x,y)\,dy = \frac{1}{\sqrt{2\pi}\,\sigma_X} e^{-(x-\bar{X})^2/2\sigma_X^2}

f_Y(y) = \int_{-\infty}^{\infty} f_{X,Y}(x,y)\,dx = \frac{1}{\sqrt{2\pi}\,\sigma_Y} e^{-(y-\bar{Y})^2/2\sigma_Y^2}

Any uncorrelated Gaussian rvs are also statistically independent.
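The integration step elided above can be checked numerically: integrating the joint density over y at a fixed x reproduces the Gaussian marginal. A minimal sketch, with hypothetical parameter values of our own:

```python
import numpy as np

# Hypothetical parameters for a jointly Gaussian pair
xb, yb, sx, sy, rho = 0.0, 0.0, 1.0, 2.0, 0.5

def f_xy(x, y):
    # Bivariate Gaussian density from the joint-pdf formula
    q = ((x - xb) ** 2 / sx ** 2
         - 2 * rho * (x - xb) * (y - yb) / (sx * sy)
         + (y - yb) ** 2 / sy ** 2)
    return np.exp(-q / (2 * (1 - rho ** 2))) / (
        2 * np.pi * sx * sy * np.sqrt(1 - rho ** 2))

# Integrate out y at a fixed x and compare with the Gaussian marginal
x0 = 0.7
y = np.linspace(-30.0, 30.0, 600_001)
marginal = np.sum(f_xy(x0, y)) * (y[1] - y[0])
expected = np.exp(-(x0 - xb) ** 2 / (2 * sx ** 2)) / np.sqrt(2 * np.pi * sx ** 2)
print(marginal, expected)
```

Both numbers agree, confirming that the marginal of a jointly Gaussian pair is itself Gaussian.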

 Correlated Gaussian rvs X and Y can be converted into two statistically independent rvs through a coordinate rotation (a linear transformation) through the angle

\theta = \frac{1}{2}\tan^{-1}\left(\frac{2\rho\sigma_X\sigma_Y}{\sigma_X^2 - \sigma_Y^2}\right)
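A short numerical check of the decorrelating rotation, using hypothetical values of \sigma_X, \sigma_Y, and \rho: rotating the coordinate axes by \theta drives the off-diagonal covariance to zero.

```python
import numpy as np

# Hypothetical correlated pair: sigma_X = 2, sigma_Y = 1, rho = 0.5
sx, sy, rho = 2.0, 1.0, 0.5
C = np.array([[sx ** 2, rho * sx * sy],
              [rho * sx * sy, sy ** 2]])

# Decorrelating rotation angle from the formula above
theta = 0.5 * np.arctan2(2 * rho * sx * sy, sx ** 2 - sy ** 2)
R = np.array([[np.cos(theta), np.sin(theta)],
              [-np.sin(theta), np.cos(theta)]])

# Covariance in the rotated coordinates: off-diagonal terms vanish
C_rot = R @ C @ R.T
print(C_rot)
```

The rotation preserves total power (the trace of the covariance matrix) while removing the correlation.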
N Random Variables
N random variables X_1, X_2, \ldots, X_N are jointly Gaussian if their joint density function can be written as

f_{X_1,\ldots,X_N}(x_1,\ldots,x_N) = \frac{|C_X|^{-1/2}}{(2\pi)^{N/2}} \exp\left( -\frac{(x-\bar{X})^t C_X^{-1} (x-\bar{X})}{2} \right)

where

x - \bar{X} = \begin{bmatrix} x_1-\bar{X}_1 \\ x_2-\bar{X}_2 \\ \vdots \\ x_N-\bar{X}_N \end{bmatrix}
\quad\text{and}\quad
C_X = \begin{bmatrix} C_{11} & C_{12} & \cdots & C_{1N} \\ C_{21} & C_{22} & \cdots & C_{2N} \\ \vdots & & & \vdots \\ C_{N1} & C_{N2} & \cdots & C_{NN} \end{bmatrix}

C_X is called the covariance matrix of the N random variables, with elements

C_{ij} = E[(X_i-\bar{X}_i)(X_j-\bar{X}_j)] = \begin{cases} \sigma_{X_i}^2 & i = j \\ C_{X_i,X_j} & i \ne j \end{cases}
• Special case: for N = 2 the covariance matrix is

C_X = \begin{bmatrix} \sigma_{X_1}^2 & \rho\sigma_{X_1}\sigma_{X_2} \\ \rho\sigma_{X_1}\sigma_{X_2} & \sigma_{X_2}^2 \end{bmatrix}

• Its inverse is

C_X^{-1} = \frac{1}{1-\rho^2} \begin{bmatrix} \dfrac{1}{\sigma_{X_1}^2} & \dfrac{-\rho}{\sigma_{X_1}\sigma_{X_2}} \\ \dfrac{-\rho}{\sigma_{X_1}\sigma_{X_2}} & \dfrac{1}{\sigma_{X_2}^2} \end{bmatrix}

• and its determinant satisfies

|C_X|^{-1} = \frac{1}{(1-\rho^2)\,\sigma_{X_1}^2\,\sigma_{X_2}^2}
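The N = 2 inverse and determinant formulas can be verified against a direct matrix inverse; the parameter values are our own:

```python
import numpy as np

# Check the N = 2 inverse formula against numpy.linalg.inv
s1, s2, rho = 1.5, 2.0, 0.6   # hypothetical sigma_X1, sigma_X2, rho
C = np.array([[s1 ** 2, rho * s1 * s2],
              [rho * s1 * s2, s2 ** 2]])

# Closed-form inverse from the slide
C_inv_formula = (1.0 / (1 - rho ** 2)) * np.array(
    [[1 / s1 ** 2, -rho / (s1 * s2)],
     [-rho / (s1 * s2), 1 / s2 ** 2]])

print(np.allclose(C_inv_formula, np.linalg.inv(C)))
print(np.isclose(1 / np.linalg.det(C), 1 / ((1 - rho ** 2) * s1 ** 2 * s2 ** 2)))
```

Both checks return True for any |\rho| < 1, which is the condition for C_X to be invertible.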
Some Properties of Jointly Gaussian Random Variables

• Gaussian rvs are completely defined by their first and second moments: mean, variance, and covariance.
• If jointly Gaussian random variables are uncorrelated, then they are statistically independent.
• A linear transformation of Gaussian rvs yields Gaussian rvs.
• The marginal densities, e.g., f_{X_1,X_2}(x_1, x_2), are also Gaussian.
• The conditional densities, e.g., f_{X_1,X_2|X_3}(x_1, x_2 \mid x_3), are also Gaussian.
Transformation of Multiple Random Variables
1. Single function transformation
2. Multiple function transformation

Single Function Transformation

Y = g(X_1, X_2, \ldots, X_N)
F_Y(y) = P\{Y \le y\} = P\{g(X_1, X_2, \ldots, X_N) \le y\}

This probability is associated with all points in the (x_1, x_2, \ldots, x_N) hyperspace that map such that g(x_1, x_2, \ldots, x_N) \le y, for any y. The distribution function is found by integrating the joint density over this region, and the density function follows by differentiating with respect to y.
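The region-integration idea can be illustrated by Monte Carlo: estimate P{g(X_1, X_2) \le y} as the fraction of sample points landing in the region. The example below is our own, taking g(X_1, X_2) = X_1 + X_2 with standard normal inputs so the exact CDF is known for comparison:

```python
import numpy as np
from math import erf

rng = np.random.default_rng(3)

# Hypothetical example: Y = g(X1, X2) = X1 + X2 with X1, X2 iid N(0, 1)
x1 = rng.standard_normal(200_000)
x2 = rng.standard_normal(200_000)
y0 = 1.0

# F_Y(y0) = P{g(X1, X2) <= y0}: fraction of samples inside the region
F_mc = np.mean(x1 + x2 <= y0)

# Exact value: X1 + X2 ~ N(0, 2), so F_Y(y0) = Phi(y0 / sqrt(2))
F_exact = 0.5 * (1 + erf(y0 / 2))
print(F_mc, F_exact)
```

The Monte Carlo fraction converges to the exact distribution function as the sample size grows.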
Linear Transformation of Gaussian Random Variables
1. Two Gaussian random variables X1 and X2 have zero mean and variances \sigma_{X_1}^2 = 4 and \sigma_{X_2}^2 = 9. Their covariance C_{X_1,X_2} equals 3. X1 and X2 are linearly transformed to new variables Y1 and Y2 according to
Y1 = X1 − 2X2
Y2 = 3X1 + 4X2
Find the mean, variance, and covariance of Y1 and Y2.
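A sketch of the computation for problem 1: writing Y = TX with T = [[1, −2], [3, 4]], the means remain zero and the covariance transforms as C_Y = T C_X T^t:

```python
import numpy as np

# Covariance matrix of X1, X2 from the problem statement
C_X = np.array([[4.0, 3.0],
                [3.0, 9.0]])

# Transformation Y = T @ X for Y1 = X1 - 2*X2, Y2 = 3*X1 + 4*X2
T = np.array([[1.0, -2.0],
              [3.0,  4.0]])

# Linear transformation of a zero-mean Gaussian vector: C_Y = T C_X T^t
C_Y = T @ C_X @ T.T
print(C_Y)   # variances on the diagonal, covariance off-diagonal
```

This gives Var(Y1) = 28, Var(Y2) = 252, and Cov(Y1, Y2) = −66, matching a direct expansion such as Var(Y1) = Var(X1) + 4 Var(X2) − 4 Cov(X1, X2).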
2. Zero-mean Gaussian random variables X1, X2, and X3 have the covariance matrix

C_X = \begin{bmatrix} 4 & 2.05 & 1.05 \\ 2.05 & 4 & 1.05 \\ 1.05 & 2.05 & 4 \end{bmatrix}

and are transformed into new rvs
Y1 = 5X1 + 2X2 − X3
Y2 = −X1 + 3X2 + X3
Y3 = 2X1 − X2 + 2X3
(a) Find the covariance matrix of Y1, Y2, and Y3.
(b) Write an expression for the joint pdf of Y1, Y2, and Y3.
3. Three random variables X1, X2, and X3 represent samples of a random noise voltage taken at three times. Their covariance matrix is

C_X = \begin{bmatrix} 3 & 1.8 & 1.1 \\ 1.8 & 3 & 1.8 \\ 1.1 & 1.8 & 3 \end{bmatrix}

A transformation matrix

T = \begin{bmatrix} 4 & -1 & -2 \\ 2 & 2 & 1 \\ -3 & -1 & 3 \end{bmatrix}

converts the variables to new variables Y1, Y2, and Y3.

(a) Find the covariance matrix of Y1, Y2, and Y3.
(b) Write an expression for the joint pdf of Y1, Y2, and Y3.
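A sketch of the computation for problem 3, using the same C_Y = T C_X T^t rule and the N-variable pdf formula from earlier (the helper name joint_pdf is ours):

```python
import numpy as np

# Problem 3 data: covariance of the noise samples and the transformation
C_X = np.array([[3.0, 1.8, 1.1],
                [1.8, 3.0, 1.8],
                [1.1, 1.8, 3.0]])
T = np.array([[ 4.0, -1.0, -2.0],
              [ 2.0,  2.0,  1.0],
              [-3.0, -1.0,  3.0]])

# (a) Covariance after the linear transformation Y = T X
C_Y = T @ C_X @ T.T

# (b) Joint pdf of the zero-mean jointly Gaussian Y, evaluated at a point y
def joint_pdf(y, C):
    N = len(y)
    norm = (2 * np.pi) ** (N / 2) * np.sqrt(np.linalg.det(C))
    return np.exp(-0.5 * y @ np.linalg.inv(C) @ y) / norm

print(C_Y)
print(joint_pdf(np.zeros(3), C_Y))  # peak value of the density
```

The resulting C_Y is symmetric, as every covariance matrix must be, and plugging it into the N-variable Gaussian formula answers part (b).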
4. Two Gaussian random variables X1 and X2 are defined by the mean vector and covariance matrix

\bar{X} = \begin{bmatrix} 2 \\ -1 \end{bmatrix}, \quad C_X = \begin{bmatrix} 5 & -2/\sqrt{5} \\ -2/\sqrt{5} & 4 \end{bmatrix}

Two new random variables Y1 and Y2 are formed using the transformation

T = \begin{bmatrix} 1 & 1/2 \\ 1/2 & 1 \end{bmatrix}

Find (a) \bar{Y}, (b) C_Y, and (c) the correlation coefficient of Y1 and Y2.
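A sketch for problem 4, taking the off-diagonal covariance as −2/√5 (our reading of the problem data): the mean transforms as \bar{Y} = T\bar{X}, the covariance as C_Y = T C_X T^t, and the correlation coefficient follows from C_Y.

```python
import numpy as np

# Problem 4 data (off-diagonal covariance read as -2/sqrt(5))
X_bar = np.array([2.0, -1.0])
C_X = np.array([[5.0, -2.0 / np.sqrt(5.0)],
                [-2.0 / np.sqrt(5.0), 4.0]])
T = np.array([[1.0, 0.5],
              [0.5, 1.0]])

# (a) transformed mean, (b) transformed covariance
Y_bar = T @ X_bar
C_Y = T @ C_X @ T.T

# (c) correlation coefficient of Y1 and Y2
rho_Y = C_Y[0, 1] / np.sqrt(C_Y[0, 0] * C_Y[1, 1])
print(Y_bar, rho_Y)
```

Note that Y1 and Y2 come out positively correlated even though X1 and X2 were negatively correlated; the mixing matrix T changes the sign of the dependence.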
5. The probability density functions of two independent random variables X and Y are
f_X(x) = e^{-x}, \quad x > 0
f_Y(y) = e^{-y}, \quad y > 0
Two new random variables are defined in terms of X and Y:
U = X + Y
V = X / (X + Y)
Find (i) the joint density function f_{UV}(u, v), and (ii) check whether U and V are independent.
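For problem 5, the Jacobian method (x = uv, y = u(1 − v), |J| = u) gives f_{UV}(u, v) = u e^{-u} for u > 0, 0 < v < 1, which factors into a function of u alone times a constant in v; hence U and V are independent, with V uniform on (0, 1). A Monte Carlo sanity check of this conclusion (our own script):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Independent unit-exponential samples for X and Y
x = rng.exponential(1.0, n)
y = rng.exponential(1.0, n)

u = x + y
v = x / (x + y)

# If U and V are independent, their correlation should be near zero,
# and V should look Uniform(0, 1): mean ~0.5, variance ~1/12.
corr_uv = np.corrcoef(u, v)[0, 1]
print(corr_uv)
print(v.mean(), v.var())
```

The near-zero correlation and uniform-looking moments of V are consistent with the factored joint density.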
Complex Random Variable
Let Z be the complex random variable
Z = X + jY
where X and Y are real random variables.
E[Z] = E[X] + jE[Y]
\sigma_Z^2 = E\left[ |Z - \bar{Z}|^2 \right]
where \bar{Z} is the mean and |Z| = \sqrt{X^2 + Y^2}.
Let Z_m and Z_n be complex random variables, m \ne n:
Z_m = X_1 + jY_1 \quad\text{and}\quad Z_n = X_2 + jY_2
The correlation of Z_m and Z_n is
R_{Z_m Z_n} = E[Z_m^* Z_n] = E[(X_1 - jY_1)(X_2 + jY_2)]
The covariance of Z_m and Z_n is
C_{Z_m Z_n} = E[(Z_m - \bar{Z}_m)^* (Z_n - \bar{Z}_n)], \quad m \ne n
where * denotes the complex conjugate.
If the covariance C_{Z_m Z_n} = 0, then Z_m and Z_n are uncorrelated, and
R_{Z_m Z_n} = E[Z_m^*]\, E[Z_n], \quad m \ne n
Statistical independence guarantees that Z_m and Z_n are uncorrelated.
6. A complex random variable Z is defined by
Z = \cos X + j \sin Y
where X and Y are independent real random variables uniformly distributed from -\pi to \pi.
(a) Find the mean value of Z.
(b) Find the variance of Z.
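A Monte Carlo check for problem 6 (our own script): over a full period, E[cos X] = E[sin Y] = 0, so the mean of Z is 0, and the variance is E[cos² X] + E[sin² Y] = 1/2 + 1/2 = 1.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500_000

# X, Y independent and uniform on (-pi, pi)
x = rng.uniform(-np.pi, np.pi, n)
y = rng.uniform(-np.pi, np.pi, n)

z = np.cos(x) + 1j * np.sin(y)

mean_z = z.mean()                         # analytically 0
var_z = np.mean(np.abs(z - mean_z) ** 2)  # analytically 1/2 + 1/2 = 1
print(mean_z, var_z)
```

The sample mean hovers near 0 and the sample variance near 1, in line with the analytical answers.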
Central Limit Theorem
The central limit theorem states that the pdf of the sum of a large number of rvs approaches a Gaussian distribution; that is, the sum is approximately normally distributed:

f_Y(y) = \frac{1}{\sqrt{2\pi\sigma_Y^2}} \exp\left( -\frac{(y - \bar{Y})^2}{2\sigma_Y^2} \right)
7. Let X1, X2, X3, ..., X10 be independent random variables, each with mean 0.5 and variance 0.25, and let W be their sum. Find f_W(w) using the central limit theorem.
8. Given W = X + Y, where X and Y are independent random variables with
\bar{X} = 3, \quad \bar{Y} = 2, \quad \sigma_X^2 = 1, \quad \sigma_Y^2 = 1.5
Find f_W(w) using the central limit theorem.
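For problem 7, the CLT gives f_W(w) ≈ Gaussian with mean 10 × 0.5 = 5 and variance 10 × 0.25 = 2.5. The sketch below also simulates the sum; the choice of exponential rvs with mean 0.5 (which happen to have variance 0.25) is ours, since the problem leaves the distribution of the X_i unspecified:

```python
import numpy as np

rng = np.random.default_rng(2)

# Problem 7: sum of 10 iid rvs with mean 0.5 and variance 0.25.
# CLT: W ~ Gaussian with mean 10*0.5 = 5 and variance 10*0.25 = 2.5.
mean_w, var_w = 10 * 0.5, 10 * 0.25

def f_W(w):
    # Approximating Gaussian density from the CLT
    return np.exp(-(w - mean_w) ** 2 / (2 * var_w)) / np.sqrt(2 * np.pi * var_w)

# Monte Carlo illustration with exponential rvs of mean 0.5 (variance 0.25)
samples = rng.exponential(0.5, size=(100_000, 10)).sum(axis=1)
print(samples.mean(), samples.var())  # close to 5 and 2.5
print(f_W(5.0))                       # peak of the approximating Gaussian
```

Problem 8 is handled the same way: W = X + Y has mean 3 + 2 = 5 and variance 1 + 1.5 = 2.5, so f_W(w) is the Gaussian density with those parameters.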
