Lecture 2
Dr Mabikwa
BIUST
1. E(aX + bY + c) = aE(X) + bE(Y) + c
2. V(X) = E{(X − µX)²}
3. V(X + Y) = V(X) + V(Y), provided X and Y are independent (or at least uncorrelated)
4. V(X) = E(X²) − µX²
The standard deviation of X is σX = +√V(X). We sometimes write
σX² = V(X).
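As a quick numerical check (a sketch, not from the slides), the identity V(X) = E(X²) − µX² can be verified in R for a small, made-up discrete distribution:

```r
# Hypothetical discrete distribution (illustrative values only)
x <- c(1, 2, 3, 4)
p <- c(0.1, 0.2, 0.3, 0.4)
mu <- sum(x * p)              # E(X) = 3
v1 <- sum((x - mu)^2 * p)     # V(X) = E{(X - mu)^2}
v2 <- sum(x^2 * p) - mu^2     # V(X) = E(X^2) - mu^2
all.equal(v1, v2)             # TRUE: both definitions give the same value
```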
sgn(s) = +1 if s > 0; 0 if s = 0; −1 if s < 0.
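R's built-in sign() implements this function directly:

```r
sign(c(-3.2, 0, 5))   # -1  0  1
```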
Sample Correlation
Corr(X, Y) = Cov(X, Y) / (sd(X) · sd(Y))
Example
x <- c(1,0,3,4,5)
y <- c(0,2,1,5,7)
cor(x,y)
## [1] 0.7856876
cov(x,y)
## [1] 4.75
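The correlation can also be recovered from the covariance and the standard deviations, matching the formula above:

```r
x <- c(1, 0, 3, 4, 5)
y <- c(0, 2, 1, 5, 7)
cov(x, y) / (sd(x) * sd(y))   # same value as cor(x, y)
```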
Dr Mabikwa (BIUST) STAT 411 - Time Series February 13, 2023 8 / 43
Auto-correlation
ρk = E[(Yt − µ)(Yt−k − µ)] / σ² = γk / σ²
These can be estimated from their sample equivalents: rk = ck / c0.
Approximate 95% significance limits for the correlogram of a purely random series: −1/n ± 2/√n.
At lag zero, rk is always equal to 1.
The correlogram for wave heights has a well-defined shape that
resembles a sampled damped cosine function, typical of an AR(2) process
(**covered later).
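As an illustration (a sketch, not from the slides), the limits −1/n ± 2/√n can be checked against the correlogram of simulated white noise, where no true autocorrelation exists:

```r
set.seed(1)
n <- 200
e <- rnorm(n)                         # purely random (white noise) series
r <- acf(e, plot = FALSE)$acf         # r_0 = 1, then r_1, r_2, ...
lower <- -1/n - 2/sqrt(n)             # approximate 95% limits
upper <- -1/n + 2/sqrt(n)
mean(r[-1] > lower & r[-1] < upper)   # most r_k (k >= 1) fall inside the limits
```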
Backshift operator
Definition: The backward shift operator B is defined by
BYt = Yt−1
BⁿYt = Yt−n
(a) If all roots (of the associated polynomial in B) are greater than unity in absolute value, the process is stationary.
(b) The random walk has root B = 1, so it is non-stationary.
Difference Operator
Differencing adjacent terms of a series can transform a non-stationary
series into a stationary one. For example, if Yt is a random walk it
is non-stationary, but first differencing makes it stationary:
∇Yt = Yt − Yt−1 = et .
Thus differencing is a useful filtering procedure.
In general, therefore,
∇ⁿ = (1 − B)ⁿ
- Proof (exercise)
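A minimal R sketch (not from the slides) confirming that first differencing ∇ = 1 − B recovers the white-noise steps of a random walk:

```r
set.seed(42)
e <- rnorm(500)        # white-noise steps e_t
y <- cumsum(e)         # random walk: Y_t = Y_{t-1} + e_t, with Y_0 = 0
d <- diff(y)           # first difference: Y_t - Y_{t-1}
all.equal(d, e[-1])    # TRUE: the differenced series is the white noise again
```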
Stochastic Processes
The mean function is the expected value of the process at time t defined
by
µt = E (Yt ) , t = 0, ±1, ±2, . . .
The autocovariance function (ACVF) is defined as
γt,s = Cov(Yt, Ys) = E[(Yt − µt)(Ys − µs)], t, s = 0, ±1, ±2, . . .
Figure: Simulated Gaussian white noise series with mean=4 and sd=2
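The figure itself is not reproduced here; a comparable series can be simulated in R (the seed is an arbitrary choice, not from the slides):

```r
set.seed(1)                          # arbitrary seed for reproducibility
w <- rnorm(100, mean = 4, sd = 2)    # Gaussian white noise, mean 4, sd 2
plot(w, type = "l", ylab = "w")
abline(h = 4, lty = 2)               # dashed line at the mean
```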
Let e1, e2, . . . be i.i.d. random variables with mean 0 and variance σe². Now construct a
time series {Yt} as
Y1 = e1
Y2 = e1 + e2
⋮
Yt = e1 + e2 + · · · + et
Equivalently,
Yt = Yt−1 + et , where Y0 = 0 with probability 1.
et is the step size taken at time t by a random walker.
Yt is the position of the random walker at time t, whose initial
position is the origin.
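A quick simulation sketch (not in the slides) of why the random walk is non-stationary: its variance Var(Yt) = t·σe² grows linearly with t.

```r
set.seed(3)
nsim <- 2000; n <- 100
# each column is one simulated random walk of length n (sigma_e = 1)
walks <- apply(matrix(rnorm(nsim * n), nrow = n), 2, cumsum)
var_t <- apply(walks, 1, var)   # sample variance of Y_t across the walks
var_t[c(1, 50, 100)]            # approximately 1, 50, 100
```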
Yt − Yt−1 (diff(Y) in R) is iid white noise.
Yt = c + Yt−1 + ϵt (a random walk with drift c).
Import stock data as provided by the code below and interpret your results.
An autoregressive process of order p, AR(p), takes the form
Yt = α1 Yt−1 + α2 Yt−2 + · · · + αp Yt−p + et
where et is white noise and the αk are the model parameters with αp ̸= 0
for an order p process. This can be expressed as a polynomial of order p in
terms of the backward shift operator:
θp(B)Yt = (1 − α1 B − α2 B² − · · · − αp Bᵖ)Yt = et
Yt = αYt−1 + et
where et is a white noise series with mean zero and variance σ 2 . Thus, it
can be shown that the second-order properties of AR(1) follow as
µY = 0
γk = αᵏ σ² / (1 − α²)
(1 − αB)Yt = et
⇒ Yt = (1 − αB)⁻¹ et = et + α et−1 + α² et−2 + · · · = Σ_{i=0}^{∞} αⁱ et−i
ρk = αk ; (k ≥ 0)
where |α| < 1. Thus, the correlogram decays to zero more rapidly for
smaller α.
set.seed(8)
x <- w <- rnorm(100)                              # w: white-noise innovations
for (t in 2:100) x[t] <- 0.7 * x[t - 1] + w[t]    # AR(1) with alpha = 0.7
layout(1:3)
plot(x, type = "l", main = "Time series plot", xlab = "Time")
acf(x)   # autocorrelations decay geometrically (~ 0.7^k)
pacf(x)  # partial autocorrelations cut off after lag 1
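As a check (a sketch building on the same simulation), the sample autocorrelations can be set against the theoretical ρk = αᵏ with α = 0.7:

```r
set.seed(8)
x <- w <- rnorm(100)
for (t in 2:100) x[t] <- 0.7 * x[t - 1] + w[t]   # same AR(1) as above
r <- acf(x, plot = FALSE)$acf[1:6]   # sample r_k for k = 0, ..., 5
theo <- 0.7^(0:5)                    # theoretical rho_k = alpha^k
round(cbind(sample = r, theory = theo), 2)       # r_0 = 1 exactly
```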