2.2 Random Variables

This document covers the fundamentals of random variables in quantitative analysis, including definitions, types (discrete and continuous), and key concepts such as probability mass functions (PMF), probability density functions (PDF), and cumulative distribution functions (CDF). It also explains expected values, population moments (mean, variance, skewness, kurtosis), and the impact of linear transformations on these variables. The document aims to equip readers with the necessary tools to analyze and interpret random variables in a financial context.

FRM Part 1

Book 2 – Quantitative Analysis

RANDOM VARIABLES
Learning Objectives
After completing this reading you should be
able to:
• Describe and distinguish a probability mass function from a cumulative distribution function and explain the relationship between these two.
• Understand and apply the concept of a mathematical expectation of a random variable.
• Describe the four common population moments.
• Explain the differences between a probability mass function and a probability density function.
• Characterize the quantile function and quantile-based estimators.
• Explain the effect of a linear transformation of a random variable on the mean, variance, standard deviation, skewness, kurtosis, median, and interquartile range.
Random Variables
• A random variable is any quantity whose future value is uncertain.
• It can also be defined as a variable whose possible values are outcomes of a random phenomenon.
  o Examples: the rate of return earned on a stock next year, the value at risk of a portfolio, or the time of death of an insured in a life assurance contract.
• An outcome is any possible value that a random variable can take.
  o Example: A lottery ticket has two outcomes – a win or a loss; or
  o The return earned by a mutual fund can take on any value around a specific mean expectation.
• An event is a specified outcome or a specified set of outcomes.
  o For example, we could define event A as the event that the return earned by a mutual fund is 10% or less, and event B as the event that the return is more than 10%.
Types of Random Variables (1/2)
A random variable can either be discrete or continuous.
• A discrete random variable is one that produces a set of distinct values. A discrete random variable arises:
  o if the range of all possible values is a finite set, e.g., {1, 2, 3, 4, 5, 6} in the case of a six-sided die, or
  o if the range of all possible values is a countably infinite set, e.g., {1, 2, 3, ...}
• For example, suppose we rolled a die. The set of possible outcomes would be {1, 2, 3, 4, 5, 6}.
• Each of these outcomes would occur with a probability of 1/6 (16.7%).
[Bar chart: PMF of a fair six-sided die, each outcome 1–6 with probability 1/6]
Types of Random Variables (2/2)
A random variable can either be discrete or continuous.
• A continuous random variable can assume any value along a given interval of the number line, for instance 𝑥 > 0, −∞ < 𝑥 < ∞, or 0 < 𝑥 < 1.
  o Examples of continuous random variables: the price of a stock, the amount of time it takes to run a marathon, etc.

• Because any single point carries zero probability, it is not possible to talk about the probability of the random variable assuming a particular value.
  o Instead, we talk about the probability of the random variable assuming a value within a given interval.
Probability Function
A probability function explains how the total chance (which is 1) is distributed
amongst the possible values of X.
• There are two key properties of a probability function:
  i. The probability for a particular value or range of values must be between 0 and 1, i.e., 0 ≤ p(x) ≤ 1.
  ii. The sum of all probabilities must equal 1.

• There are two types of probability functions:
  I. Probability mass function
  II. Probability density function

• We are going to look at each of these in detail…


Probability Mass Function
A probability mass function (PMF) is a probability function for discrete
random variables.
• It gives the probability that a random variable takes a particular value.

Example: Bernoulli Variable

• The PMF of a Bernoulli random variable X is given by

  f_X(x) = p^x (1 − p)^(1−x),  x = 0, 1

• The graph of the Bernoulli PMF is shown below, assuming p = P(success) = 0.7. Note that the PMF is only defined for x = 0, 1.
[Bar chart: Bernoulli PMF with p = 0.7, showing f(0) = 0.3 and f(1) = 0.7]
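As a quick illustration, the Bernoulli PMF above can be written as a small function. This is a sketch; the function name is ours, not from the reading.

```python
def bernoulli_pmf(x: int, p: float) -> float:
    """f_X(x) = p^x * (1 - p)^(1 - x) for x in {0, 1}."""
    if x not in (0, 1):
        raise ValueError("Bernoulli outcomes are 0 or 1")
    return p ** x * (1 - p) ** (1 - x)

# With p = 0.7, as on the slide:
print(bernoulli_pmf(1, 0.7))  # 0.7  (success)
print(bernoulli_pmf(0, 0.7))  # ~0.3 (failure)
```

Note that the two probabilities always sum to 1, as any valid probability function must.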
Probability Density Function
A probability density function (PDF) is a probability function for continuous
random variables.
• It gives the probability that a random variable takes a value within some specified interval.
• Given a PDF f(x), we can determine the probability that X falls between a and b:

  Pr(a < X ≤ b) = ∫_a^b f(x) dx

• The probability of the random variable assuming a value within some given interval from a to b is defined to be the area under the graph of the probability density function between a and b.
[Plots: the shaded area under f(x) between a and b, for uniform, normal, and exponential PDFs]
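The interval probability Pr(a < X ≤ b) = ∫_a^b f(x) dx can be checked numerically. The sketch below integrates an exponential PDF with a simple midpoint rule; the function names and the choice of rate λ = 1 are ours, for illustration only.

```python
import math

def interval_probability(pdf, a: float, b: float, n: int = 100_000) -> float:
    """Midpoint-rule approximation of the area under pdf between a and b."""
    width = (b - a) / n
    return sum(pdf(a + (i + 0.5) * width) for i in range(n)) * width

# Exponential PDF with rate lam = 1: f(x) = lam * exp(-lam * x), x > 0
lam = 1.0
approx = interval_probability(lambda x: lam * math.exp(-lam * x), 0.0, 1.0)
print(round(approx, 4))  # 0.6321, matching the closed form 1 - e^(-1)
```

For this distribution the exact answer is available in closed form, which makes it a convenient sanity check on the numerical approximation.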
Cumulative Distribution Function
A cumulative distribution function (CDF), gives the probability that the
random variable X is less than or equal to x, for every value x, i.e.
𝑭(𝒙) = 𝑷(𝑿 ≤ 𝒙)
• Both discrete and continuous distributions have cumulative distribution functions.
• For instance, the CDF of the Bernoulli random variable is:

  F_X(x) = 0 for x < 0;  1 − p for 0 ≤ x < 1;  1 for x ≥ 1

[Step plot: the Bernoulli CDF with p = 0.7 jumps from 0 to 0.3 at x = 0 (P(X ≤ 0) = 1 − p = 0.3) and from 0.3 to 1.0 at x = 1]
CDF for Continuous Variables
• To determine the CDF of a continuous random variable, its PDF is integrated from its lower bound:

  F(a) = ∫_{−∞}^a f(x) dx = P(X ≤ a)

[Plots: F(a) as the shaded area under f(x) up to a, for uniform, normal, and exponential PDFs]
Relationship between CDF and PMF
with Discrete Random Variables
• The CDF can be represented as the sum of the PMF over all values that are less than or equal to x. Simply put:

  F_X(x) = Σ_{t ∈ R(X), t ≤ x} f_X(t)

  o where R(X) is the range (the set of values that X can take).

• Conversely, for an integer-valued X, the PMF is the difference between consecutive values of the CDF. That is:

  f_X(x) = F_X(x) − F_X(x − 1)
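For a concrete check of this relationship, the sketch below builds the CDF of a fair six-sided die (our example, not from the reading) as a running sum of its PMF and then recovers the PMF by differencing:

```python
pmf = {x: 1 / 6 for x in range(1, 7)}        # f_X(x) = 1/6 for x = 1..6

def cdf(x: int) -> float:
    """F_X(x): sum of f_X(t) over all t <= x."""
    return sum(p for t, p in pmf.items() if t <= x)

print(round(cdf(3), 4))                      # 0.5 (= 3/6, i.e. P(X <= 3))
# Recover the PMF as a difference of consecutive CDF values:
print(round(cdf(4) - cdf(3), 4))             # 0.1667 (= 1/6, i.e. f_X(4))
```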
Expected Value (1/2)
The expected value of a random variable (denoted as E(X)) is the probability-
weighted average of the possible outcomes of the random variable.
• For a discrete random variable, the expected value is the sum, over all outcomes, of the product of each value of the random variable and its probability:

  E(X) = P(X₁)X₁ + P(X₂)X₂ + ⋯ + P(Xₙ)Xₙ = Σᵢ₌₁ⁿ P(Xᵢ)Xᵢ

• For a continuous random variable, the expected value is calculated by integrating the product of the value of the random variable and its density:

  E(X) = ∫_{−∞}^{∞} x f(x) dx
Example >>
Expected Value (2/2)
Example
 There are 8 different positions with different values in a bank’s trading
book. Positions 1 to 3 are each worth $1m, 4 and 5 are worth $2m, and the
rest are worth $3m. Determine the mean portfolio value.
Solution
• The PMF can be represented as:

  f(x) = 3/8 for x = 1;  1/4 for x = 2;  3/8 for x = 3   (x in $m)

• Now,

  E(X) = Σₓ x f(x) = $1m × 3/8 + $2m × 1/4 + $3m × 3/8 = $2m

• So, the mean value is $2m.
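The same computation in code, as a sketch of the worked example (values in $m):

```python
# PMF from the example: 3 positions worth $1m, 2 worth $2m, 3 worth $3m (of 8)
pmf = {1: 3 / 8, 2: 1 / 4, 3: 3 / 8}

expected_value = sum(x * p for x, p in pmf.items())
print(expected_value)  # 2.0, i.e. the mean portfolio value is $2m
```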
Population Moments
Mean
• The mean is the first moment and is given by:

  μ = E(X)

  o It is the average value of X.

Variance and Standard Deviation
• The variance is the second moment and is given by:

  σ² = E[(X − E(X))²] = E[(X − μ)²]

  o The variance measures the spread of the random variable from its mean.

• The standard deviation (σ) is the square root of the variance.
  o The standard deviation is more commonly quoted in the world of finance because it is measured in the same units as X, while the variance is in X-units squared.
Skewness
• Skewness is the third moment of the distribution:

  skew(X) = E[(X − E(X))³] / σ³ = E[((X − μ)/σ)³]
o It measures the asymmetry of the distribution.
Kurtosis
• Kurtosis is the fourth moment of the distribution:

  Kurt(X) = E[(X − E(X))⁴] / σ⁴ = E[((X − μ)/σ)⁴]

  o It represents the peakedness of a distribution, usually taken in relation to a normal distribution.

• A distribution that's more peaked than normal is called leptokurtic.
• A distribution that's less peaked than normal is called platykurtic.
• A normal curve itself is called mesokurtic: neither too peaked nor too flat-topped.
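All four population moments can be computed directly from a PMF. The sketch below uses a fair six-sided die (our example, not from the reading); the die is symmetric, so its skewness is zero, and it is flatter than a normal curve, so its kurtosis is below 3 (platykurtic).

```python
pmf = {x: 1 / 6 for x in range(1, 7)}        # fair die

mean = sum(x * p for x, p in pmf.items())                        # 1st moment
var = sum((x - mean) ** 2 * p for x, p in pmf.items())           # 2nd (central)
std = var ** 0.5
skew = sum(((x - mean) / std) ** 3 * p for x, p in pmf.items())  # 3rd (standardized)
kurt = sum(((x - mean) / std) ** 4 * p for x, p in pmf.items())  # 4th (standardized)

print(round(mean, 4))  # 3.5
print(round(var, 4))   # 2.9167 (= 35/12)
print(round(kurt, 2))  # 1.73, less than 3: platykurtic
```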
Quantiles
• For a continuous variable X, the α-quantile of X is the smallest number m such that:

  Pr(X < m) = α

  o where α ∈ [0, 1].

• For instance, if X is a continuous random variable, the median is defined to be the solution of:

  P(X < m) = ∫_{−∞}^{m} f_X(x) dx = 0.5

  o Similarly, the lower and upper quartiles are such that P(X < Q1) = 0.25 and P(X < Q3) = 0.75.

• The interquartile range (IQR) is an alternative measure of spread. It is given by:

  IQR = Q3 − Q1
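Quartiles and the IQR of a sample can be obtained with the standard library; the sketch below uses `statistics.quantiles`, whose `method='inclusive'` option matches the usual textbook quartile rule. The data values are assumed, for illustration only.

```python
import statistics

returns = [2, 4, 4, 5, 7, 9, 11, 12, 13]   # assumed sample of returns (%)

q1, median, q3 = statistics.quantiles(returns, n=4, method='inclusive')
iqr = q3 - q1

print(median)  # 7.0
print(iqr)     # 7.0 (IQR = Q3 - Q1 = 11 - 4)
```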
Mode
• The mode measures the common tendency, that is, the location of the most frequently observed value of a random variable.
  o In the case of a continuous random variable, the mode is the highest point of the PDF.
• Random variables can be unimodal if there is just one mode, bimodal if there are two modes, or multimodal if there are more than two modes.

[Bar charts: a unimodal distribution with a single peak, and a bimodal distribution with two peaks]
Standardizing Random Variables
• When X has mean μ and standard deviation σ, a standardized version of X can be constructed.
• A z-score, the standardized version of X, indicates distance from the mean in standard-deviation units:

  z = (X − μ) / σ

  o Converting to standard or z-scores does not change the shape of the distribution.

• We will learn more about z-scores in future chapters.
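The z-score formula is a one-liner; the input values below are assumptions chosen for illustration.

```python
def z_score(x: float, mu: float, sigma: float) -> float:
    """Distance of x from the mean, in standard-deviation units."""
    return (x - mu) / sigma

# A 12% return when the mean return is 8% and the standard deviation is 2%:
print(z_score(12, 8, 2))  # 2.0 (two standard deviations above the mean)
```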


Linear Transformation of
Random Variables (1/2)
• A linear transformation is a change to a variable via one or more of the basic arithmetic operations:
  o adding a constant to the variable,
  o subtracting a constant from the variable,
  o multiplying the variable by a constant,
  o and/or dividing the variable by a constant.

• A transformation results in the formation of a new random variable.
  o Linear transformations are useful because many variables used in finance and risk management do not have a natural scale.

Example >>
Linear Transformation of
Random Variables (2/2)
Example
• Suppose your salary is α dollars per year, and you are entitled to a bonus of β dollars for every dollar of sales you successfully bring in.
  o Let X be what you sell in a certain year.
• How much in total do you make?
Solution
• We can linearly transform the sales variable X into a new variable Y that represents the total amount made:

  Y = α + βX

  o Here Y is the total amount made (the new variable), α is the shift constant, and β is the scale constant.
How does Linear Transformation
Affect Moments?
• If Y = α + βX, where α and β are constants, the mean of Y is given by:

  E(Y) = E(α + βX) = α + βE(X)

• The variance is given by:

  Var(Y) = Var(α + βX) = β²Var(X) = β²σ²

• The shift parameter α does not affect the variance.
  o Variance is a measure of spread from the mean; adding α does not change the spread but merely shifts the distribution to the left or to the right.
• The standard deviation of Y is given by:

  √(β²σ²) = |β|σ

• If β is positive, the skewness and kurtosis of Y are identical to those of X.
• If β is negative, there is no effect on kurtosis, and the skewness has the same magnitude but the opposite sign.
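These rules can be verified numerically. The sketch below transforms an assumed right-skewed PMF with Y = α + βX (with β < 0) and recomputes the moments directly from both PMFs:

```python
# Assumed right-skewed PMF for X (illustrative values, not from the reading)
pmf_x = {1: 0.5, 2: 0.3, 3: 0.2}

def moments(pmf):
    """(mean, variance, skewness) of a discrete distribution given its PMF."""
    mean = sum(x * p for x, p in pmf.items())
    var = sum((x - mean) ** 2 * p for x, p in pmf.items())
    std = var ** 0.5
    skew = sum(((x - mean) / std) ** 3 * p for x, p in pmf.items())
    return mean, var, skew

alpha, beta = 10, -2                                     # negative beta
pmf_y = {alpha + beta * x: p for x, p in pmf_x.items()}  # Y = alpha + beta*X

mx, vx, sx = moments(pmf_x)
my, vy, sy = moments(pmf_y)
print(abs(my - (alpha + beta * mx)) < 1e-9)  # True: E(Y) = alpha + beta*E(X)
print(abs(vy - beta ** 2 * vx) < 1e-9)       # True: Var(Y) = beta^2 * Var(X)
print(abs(sy + sx) < 1e-9)                   # True: skewness flips sign
```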
Learning Objectives Recap
• Describe and distinguish a probability mass function from a cumulative distribution function and explain the relationship between these two.
• Understand and apply the concept of a mathematical expectation of a random variable.
• Describe the four common population moments.
• Explain the differences between a probability mass function and a probability density function.
• Characterize the quantile function and quantile-based estimators.
• Explain the effect of a linear transformation of a random variable on the mean, variance, standard deviation, skewness, kurtosis, median, and interquartile range.
