
HE4020 Econometric Time Series Analysis
Semester 1, 2015-16

LINEAR TIME SERIES MODELS
1 Stationarity Through Differencing
Consider the monthly price of a barrel of crude oil from
January 1986 through January 2006.
[Figure: time series plot of the price per barrel, 1986 to 2006]

The series displays considerable variation, especially since 2001, and a stationary model does not seem reasonable. In many instances, differencing the original observations can transform a nonstationary series into a stationary one.
Suppose

    Y_t = M_t + e_t
    M_t = M_{t-1} + \varepsilon_t                                    (1)

where \{e_t\} and \{\varepsilon_t\} are independent white noise series. Then

    \nabla Y_t = \nabla M_t + \nabla e_t = \varepsilon_t + e_t - e_{t-1}    (2)

which would have the autocorrelation function of an MA(1) series with

    \rho_1 = -1 / [2 + (\sigma_\varepsilon^2 / \sigma_e^2)]          (3)
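This can be checked by simulation. The sketch below (not from the notes; the noise variances are arbitrary choices) simulates the random-walk-plus-noise model and compares the sample lag-1 autocorrelation of the differenced series with the theoretical value in (3):

```python
import numpy as np

# Simulate Y_t = M_t + e_t with M_t = M_{t-1} + eps_t (random-walk trend),
# then check that the lag-1 autocorrelation of the differenced series is
# near rho_1 = -1 / (2 + sigma_eps^2 / sigma_e^2).
rng = np.random.default_rng(0)
n, sigma_e, sigma_eps = 200_000, 1.0, 1.0

e = rng.normal(0.0, sigma_e, n)
eps = rng.normal(0.0, sigma_eps, n)
Y = np.cumsum(eps) + e            # M_t + e_t, with M_t = cumsum of eps

dY = np.diff(Y)                   # equals eps_t + e_t - e_{t-1}, an MA(1)
rho1_sample = np.corrcoef(dY[:-1], dY[1:])[0, 1]
rho1_theory = -1.0 / (2.0 + sigma_eps**2 / sigma_e**2)

print(rho1_sample, rho1_theory)
```

With equal noise variances, \rho_1 = -1/3, and the sample estimate lands close to it.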


Further, suppose

    Y_t = M_t + e_t
    M_t = M_{t-1} + W_t                                              (4)
    W_t = W_{t-1} + \varepsilon_t

Here, the stochastic trend term M_t is such that its "rate of change" changes slowly over time. Then

    \nabla Y_t = \nabla M_t + \nabla e_t = W_t + \nabla e_t          (5)

and

    \nabla^2 Y_t = \nabla W_t + \nabla^2 e_t
                 = \varepsilon_t + (e_t - e_{t-1}) - (e_{t-1} - e_{t-2})    (6)
                 = \varepsilon_t + e_t - 2e_{t-1} + e_{t-2}

which has the autocorrelation function of an MA(2) process. Hence, the second difference of the nonstationary process \{Y_t\} is stationary.
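The algebra behind (6) can be confirmed numerically. A minimal sketch (noise drawn arbitrarily, initial values taken as zero) builds the model in (4) and checks that the second difference equals \varepsilon_t + e_t - 2e_{t-1} + e_{t-2} term by term:

```python
import numpy as np

# Build Y_t = M_t + e_t with M_t = M_{t-1} + W_t and W_t = W_{t-1} + eps_t,
# then verify that the second difference matches eq. (6) exactly.
rng = np.random.default_rng(1)
n = 1000
e = rng.normal(size=n)
eps = rng.normal(size=n)

W = np.cumsum(eps)          # W_t = W_{t-1} + eps_t
M = np.cumsum(W)            # M_t = M_{t-1} + W_t
Y = M + e

d2Y = np.diff(Y, n=2)                          # second difference of Y
rhs = eps[2:] + e[2:] - 2 * e[1:-1] + e[:-2]   # right-hand side of (6)

print(np.max(np.abs(d2Y - rhs)))   # essentially zero (floating-point noise)
```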
1.1 ARIMA Models
A time series \{Y_t\} is said to follow an integrated autoregressive moving average model if the dth difference W_t = \nabla^d Y_t is a stationary ARMA process. If \{W_t\} follows an ARMA(p, q) model, we say that \{Y_t\} is an ARIMA(p, d, q) process. In practice, d usually equals 1 or 2.

Consider an ARIMA(p, 1, q) process. With W_t = Y_t - Y_{t-1}, we have

    W_t = \phi_1 W_{t-1} + \cdots + \phi_p W_{t-p} + e_t - \theta_1 e_{t-1} - \cdots - \theta_q e_{t-q}    (7)
In terms of the observed series,

    Y_t - Y_{t-1} = \phi_1 (Y_{t-1} - Y_{t-2}) + \cdots + \phi_p (Y_{t-p} - Y_{t-p-1}) + e_t - \theta_1 e_{t-1} - \cdots - \theta_q e_{t-q}    (8)

which we may write as

    Y_t = (1 + \phi_1) Y_{t-1} + (\phi_2 - \phi_1) Y_{t-2} + \cdots + (\phi_p - \phi_{p-1}) Y_{t-p} - \phi_p Y_{t-p-1} + e_t - \theta_1 e_{t-1} - \cdots - \theta_q e_{t-q}    (9)
Notice that it appears to be an ARMA(p+1, q) process. However, the characteristic polynomial satisfies

    1 - (1 + \phi_1)L - (\phi_2 - \phi_1)L^2 - \cdots - (\phi_p - \phi_{p-1})L^p + \phi_p L^{p+1}
    = (1 - \phi_1 L - \phi_2 L^2 - \cdots - \phi_p L^p)(1 - L)       (10)

which can be easily checked. This factorization clearly shows the root L = 1, which implies nonstationarity. The remaining roots, however, are the roots of the characteristic polynomial of the stationary process \nabla Y_t.
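The factorization in (10) is easy to verify mechanically. A small sketch for p = 2 with hypothetical coefficients \phi_1 = 0.5, \phi_2 = -0.3 (any stationary pair would do), using NumPy's polynomial multiplication:

```python
import numpy as np

# Verify (1 - phi1*L - phi2*L^2)(1 - L)
#      = 1 - (1+phi1)L - (phi2-phi1)L^2 + phi2*L^3.
# np.polymul expects coefficients ordered from highest power to lowest.
phi1, phi2 = 0.5, -0.3

ar_poly = np.array([-phi2, -phi1, 1.0])    # -phi2*L^2 - phi1*L + 1
diff_poly = np.array([-1.0, 1.0])          # -L + 1
product = np.polymul(ar_poly, diff_poly)

# Expected left-hand side of (10), highest power first
expected = np.array([phi2, -(phi2 - phi1), -(1 + phi1), 1.0])

print(np.allclose(product, expected))   # True
```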
If the process contains no autoregressive terms, we call it an integrated moving average and abbreviate the name to IMA(d, q). If no moving average terms are present, we denote the model as ARI(p, d).
1.2 The IMA(1,1) Model
In difference equation form, the model is

    Y_t = Y_{t-1} + e_t - \theta e_{t-1}                             (11)

To write Y_t explicitly as a function of present and past noise values, repeatedly substitute for the Y_{t-j}, j = 1, 2, ..., on the right-hand side of the equation:

    Y_t = e_t + (1 - \theta)e_{t-1} + (1 - \theta)e_{t-2} + \cdots + (1 - \theta)e_{-m} - \theta e_{-m-1}    (12)

Note that the weights on the white noise terms do not die out as we go further into the past.
We can derive the variance and correlations:

    Var(Y_t) = [1 + \theta^2 + (1 - \theta)^2 (t + m)] \sigma_e^2    (13)

    Corr(Y_t, Y_{t-k}) \approx \sqrt{(t + m - k)/(t + m)} \approx 1  (14)

As t increases, Var(Y_t) increases and could be quite large. Also, the correlation between Y_t and Y_{t-k} will be strongly positive for many lags k = 1, 2, ...
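Formula (13) can be rebuilt directly from the weights in (12): one weight of 1 on e_t, (t + m) weights of (1 - \theta), and one weight of -\theta on e_{-m-1}. A sketch with hypothetical values \theta = 0.6, t = 50, m = 10:

```python
import numpy as np

# Sum the squared weights from representation (12) and compare with the
# closed-form variance in (13).
theta, sigma_e2 = 0.6, 1.0
t, m = 50, 10

weights = np.concatenate(([1.0], np.full(t + m, 1.0 - theta), [-theta]))
var_from_weights = sigma_e2 * np.sum(weights**2)
var_formula = (1.0 + theta**2 + (1.0 - theta)**2 * (t + m)) * sigma_e2

print(var_from_weights, var_formula)
```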
1.3 The IMA(2,2) Model
In difference equation form, we have

    \nabla^2 Y_t = e_t - \theta_1 e_{t-1} - \theta_2 e_{t-2}         (15)

or

    Y_t = 2Y_{t-1} - Y_{t-2} + e_t - \theta_1 e_{t-1} - \theta_2 e_{t-2}    (16)

Substituting for the Y's on the right-hand side, we can express Y_t in terms of e_t, e_{t-1}, ...:

    Y_t = e_t + \sum_{j=1}^{t+m} \psi_j e_{t-j} - [(t + m + 1)\theta_1 + (t + m)\theta_2] e_{-m-1} - (t + m + 1)\theta_2 e_{-m-2}    (17)

where \psi_j = 1 + \theta_2 + (1 - \theta_1 - \theta_2) j for j = 1, 2, ..., t + m.

Once again we see that the \psi-weights do not die out but form a linear function of j. Variances and correlations for Y_t can be obtained from equation (17), but the calculations are tedious. We simply note that the variance of Y_t increases rapidly with t and Corr(Y_t, Y_{t-k}) is nearly 1 for all moderate k.
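The linear form of the \psi-weights can be checked by expanding (1 - L)^2 \psi(L) = 1 - \theta_1 L - \theta_2 L^2 recursively. The sketch below uses the hypothetical values \theta_1 = 0.4, \theta_2 = 0.25:

```python
# Equating powers of L in (1 - L)^2 * psi(L) = 1 - th1*L - th2*L^2 gives
# psi_0 = 1, psi_1 = 2 - th1, psi_2 = 2*psi_1 - psi_0 - th2, and
# psi_k = 2*psi_{k-1} - psi_{k-2} for k >= 3 (hence linear growth in k).
th1, th2 = 0.4, 0.25

psi = [1.0, 2.0 - th1, 2.0 * (2.0 - th1) - 1.0 - th2]
for k in range(3, 20):
    psi.append(2.0 * psi[-1] - psi[-2])

# Closed form from the notes: psi_j = 1 + th2 + (1 - th1 - th2) * j
closed = [1.0 + th2 + (1.0 - th1 - th2) * j for j in range(1, 20)]
max_gap = max(abs(a - b) for a, b in zip(psi[1:], closed))
print(max_gap)   # essentially zero
```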
The plot below shows a simulated IMA(2,2) process
[Figure: IMA(2,2) simulation, t = 0 to 60]
Notice the smooth change in process values.


The increasing variance and the strong, positive neighbouring correlations dominate the appearance of the time series plot.
A plot of the first difference of the simulated series is shown below. This series is also nonstationary; it is governed by an IMA(1,2) model.

[Figure: first difference of the IMA(2,2) simulation, t = 0 to 60]

A second differencing produces the following plot:

[Figure: twice-differenced series, t = 0 to 60]

1.4 The ARI(1,1) Model


The ARI(1,1) process satisfies

    Y_t - Y_{t-1} = \phi (Y_{t-1} - Y_{t-2}) + e_t                   (18)

or

    Y_t = (1 + \phi) Y_{t-1} - \phi Y_{t-2} + e_t                    (19)

where |\phi| < 1.
To find the \psi-weights, we can use a technique that generalizes to arbitrary ARIMA models. It can be shown that the \psi-weights can be obtained by equating like powers of L in the identity

    (1 - \phi_1 L - \cdots - \phi_p L^p)(1 - L)^d (1 + \psi_1 L + \psi_2 L^2 + \cdots)
    = 1 - \theta_1 L - \theta_2 L^2 - \cdots - \theta_q L^q          (20)
For the ARI(1,1) case, this relationship reduces to

    (1 - \phi L)(1 - L)(1 + \psi_1 L + \psi_2 L^2 + \cdots) = 1      (21)

or

    [1 - (1 + \phi)L + \phi L^2](1 + \psi_1 L + \psi_2 L^2 + \cdots) = 1    (22)

Equating like powers of L on both sides, we obtain

    -(1 + \phi) + \psi_1 = 0                                         (23)

    \phi - (1 + \phi)\psi_1 + \psi_2 = 0                             (24)

and, in general,

    \psi_k = (1 + \phi)\psi_{k-1} - \phi\psi_{k-2}  for k \ge 2      (25)

with \psi_0 = 1 and \psi_1 = 1 + \phi. This recursion with starting values allows us to compute as many \psi-weights as necessary. It can be shown that in this case an explicit solution to the recursion is given as

    \psi_k = (1 - \phi^{k+1}) / (1 - \phi)  for k \ge 1              (26)
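The agreement between the recursion (25) and the closed form (26) is easy to check numerically; a sketch with the hypothetical value \phi = 0.7 (any |\phi| < 1 works):

```python
# Run recursion (25) with psi_0 = 1, psi_1 = 1 + phi and compare every
# term against the closed form psi_k = (1 - phi^(k+1)) / (1 - phi).
phi = 0.7

psi = [1.0, 1.0 + phi]
for k in range(2, 30):
    psi.append((1.0 + phi) * psi[k - 1] - phi * psi[k - 2])

closed = [(1.0 - phi ** (k + 1)) / (1.0 - phi) for k in range(30)]
max_gap = max(abs(a - b) for a, b in zip(psi, closed))
print(max_gap)   # essentially zero
```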

2 Constant Terms in ARIMA Models
For an ARIMA(p, d, q) model, \nabla^d Y_t = W_t is a stationary ARMA(p, q) process. It is standard to assume that the stationary process has a zero mean. However, if W_t is assumed to have a nonzero mean \mu, this can be allowed for in two different ways. We can introduce a constant term \theta_0 into the model as follows:

    W_t = \theta_0 + \phi_1 W_{t-1} + \cdots + \phi_p W_{t-p} + e_t - \theta_1 e_{t-1} - \cdots - \theta_q e_{t-q}    (27)
Alternatively, taking expectations on both sides, we find

    \mu = \theta_0 + (\phi_1 + \cdots + \phi_p)\mu                   (28)

so that

    \theta_0 = \mu(1 - \phi_1 - \cdots - \phi_p)                     (29)

Substituting for \theta_0 in equation (27) and gathering terms, we have

    W_t - \mu = \phi_1 (W_{t-1} - \mu) + \cdots + \phi_p (W_{t-p} - \mu) + e_t - \theta_1 e_{t-1} - \cdots - \theta_q e_{t-q}    (30)

Thus, we see that a nonzero-mean ARMA(p, q) can be represented as a zero-mean ARMA(p, q) when the W_t are expressed as deviations from the mean. What will be the effect of a nonzero mean for W_t on the undifferenced series Y_t?
Consider the IMA(1,1) case with a constant term. We have

    W_t = \theta_0 + e_t - \theta e_{t-1}                            (31)

or

    Y_t = Y_{t-1} + \theta_0 + e_t - \theta e_{t-1}                  (32)

By iterating into the past, we can establish that

    Y_t = e_t + (1 - \theta)e_{t-1} + \cdots + (1 - \theta)e_{-m} - \theta e_{-m-1} + (t + m + 1)\theta_0    (33)

Comparing this with equation (12), we see that there is an added linear deterministic time trend (t + m + 1)\theta_0 with slope \theta_0. An equivalent representation of the process would then be

    Y_t = Y_t' + \beta_0 + \beta_1 t                                 (34)

where Y_t' is an IMA(1,1) series with E(\nabla Y_t') = 0 and E(\nabla Y_t) = \beta_1.
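The drift interpretation shows up in a quick simulation sketch (hypothetical values \theta_0 = 0.5, \theta = 0.6): the first differences of the series average out to the slope \theta_0.

```python
import numpy as np

# Simulate Y_t = Y_{t-1} + theta0 + e_t - theta*e_{t-1}; the sample mean
# of the first differences should be close to theta0.
rng = np.random.default_rng(2)
theta0, theta, n = 0.5, 0.6, 100_000

e = rng.normal(size=n)
dY = theta0 + e[1:] - theta * e[:-1]   # first differences W_t
Y = np.cumsum(dY)                      # the drifting IMA(1,1) level itself

print(dY.mean())   # close to theta0
```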
For a general ARIMA(p, d, q) model where E(\nabla^d Y_t) \ne 0, it can be argued that

    Y_t = Y_t' + \mu_t                                               (35)

where \mu_t is a deterministic polynomial of degree d and Y_t' is ARIMA(p, d, q) with E(\nabla^d Y_t') = 0. With d = 2 and \theta_0 \ne 0, a quadratic trend would be implied.

3 The Logarithm Transformation


The logarithm transformation can be useful in certain circumstances. We frequently encounter series where increased dispersion seems to be associated with higher levels of the series: the higher the level of the series, the more variation there is around that level, and conversely.
Suppose that Y_t > 0 for all t, and that E(Y_t) = \mu_t and \sqrt{Var(Y_t)} = \mu_t \sigma. By Taylor's expansion, we have

    \log(Y_t) \approx \log(\mu_t) + (Y_t - \mu_t)/\mu_t              (36)

Then,

    E[\log(Y_t)] \approx \log(\mu_t)                                 (37)

and

    Var[\log(Y_t)] \approx \sigma^2                                  (38)

In other words, if the standard deviation of the series is proportional to the level of the series, then transforming to logarithms will produce a series with approximately constant variance over time.
Also, if the level of the series is changing roughly exponentially, the log-transformed series will exhibit a linear time trend. Thus, we might then want to take first differences.
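A sketch of the variance-stabilizing effect (hypothetical \sigma = 0.05; lognormal noise is one convenient way to make the standard deviation proportional to the level):

```python
import numpy as np

# At each level mu, generate Y with standard deviation roughly mu*sigma;
# the variance of log(Y) is then about sigma^2 regardless of the level.
rng = np.random.default_rng(3)
sigma = 0.05

log_vars = [np.var(np.log(mu * np.exp(rng.normal(0.0, sigma, 50_000))))
            for mu in (10.0, 1_000.0, 100_000.0)]
print(log_vars)   # each close to sigma**2 = 0.0025
```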

3.1 Percentage Changes


The logarithm transformation may be useful in another context. Suppose Y_t tends to have relatively stable percentage changes from one time period to the next. Specifically, assume that

    Y_t = (1 + X_t) Y_{t-1}                                          (39)

where 100 X_t is the percentage change (possibly negative) from Y_{t-1} to Y_t. Then

    \log(Y_t) - \log(Y_{t-1}) = \log(1 + X_t)                        (40)

If X_t is restricted to, say, |X_t| < 0.2 (i.e., the percentage changes are at most 20%), then, to a good approximation, \log(1 + X_t) \approx X_t. Consequently,

    \nabla[\log(Y_t)] \approx X_t                                    (41)

will be relatively stable and perhaps well modeled by a stationary process.
Note that we take logs first and then compute first differences. In the finance literature, the differences of the (natural) logarithms are usually called returns.
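A sketch of the approximation (hypothetical returns drawn uniformly within plus or minus 5%): differencing the log series recovers the percentage changes almost exactly.

```python
import numpy as np

# Build Y_t = (1 + X_t) * Y_{t-1} from stable percentage changes X_t,
# then check that the log returns approximate X_t (since log(1+x) ~ x).
rng = np.random.default_rng(4)
X = rng.uniform(-0.05, 0.05, 1_000)    # percentage changes up to +/- 5%

Y = 100.0 * np.cumprod(1.0 + X)        # Y_t = (1 + X_t) * Y_{t-1}
log_returns = np.diff(np.log(Y))       # exactly log(1 + X_t) for t >= 1

print(np.max(np.abs(log_returns - X[1:])))   # small, since log(1+x) ~ x
```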
As an example, the figure below shows a time series
plot of total monthly electricity generated in millions of
kilowatt-hours.
Note that the higher values display considerably more
variation than the lower values.
[Figure: monthly electricity generation, 1975 to 2005]

Taking the log of the series, we see that the transformed series exhibits much more uniform variation across high and low values.

[Figure: log(electricity), 1975 to 2005]

Next, the first difference of log(electricity) seems stationary.

[Figure: first difference of log(electricity), 1975 to 2005]
