Multiple Regression

The document provides an overview of multiple regression analysis, including the formulation of the regression model, estimation processes, and the least squares method for calculating coefficients. It discusses the significance testing using F and t tests, assumptions about the error term, and the interpretation of regression coefficients. Additionally, it highlights the importance of addressing multicollinearity and using qualitative independent variables in regression models.


Multiple Regression

• Multiple Regression Model
• Least Squares Method
• Multiple Coefficient of Determination
• Model Assumptions
• Testing for Significance
• Using the Estimated Regression Equation for Estimation and Prediction
• Qualitative Independent Variables
• Residual Analysis
• Logistic Regression
Multiple Regression Model

• Multiple Regression Model

The equation that describes how the dependent variable y is related to the independent variables x1, x2, . . . , xp and an error term is:

y = β0 + β1x1 + β2x2 + . . . + βpxp + ε

where:
β0, β1, β2, . . . , βp are the parameters, and
ε is a random variable called the error term
Multiple Regression Equation

• Multiple Regression Equation

The equation that describes how the mean value of y is related to x1, x2, . . . , xp is:

E(y) = β0 + β1x1 + β2x2 + . . . + βpxp
Estimated Multiple Regression Equation

• Estimated Multiple Regression Equation

ŷ = b0 + b1x1 + b2x2 + . . . + bpxp

A simple random sample is used to compute sample statistics b0, b1, b2, . . . , bp that are used as the point estimators of the parameters β0, β1, β2, . . . , βp.
Estimation Process
Multiple Regression Model:
y = β0 + β1x1 + β2x2 + . . . + βpxp + ε

Multiple Regression Equation:
E(y) = β0 + β1x1 + β2x2 + . . . + βpxp

Unknown parameters are β0, β1, β2, . . . , βp

Sample Data:
observed values of x1, x2, . . . , xp and y for each observation

Estimated Multiple Regression Equation:
ŷ = b0 + b1x1 + b2x2 + . . . + bpxp

Sample statistics b0, b1, b2, . . . , bp provide estimates of β0, β1, β2, . . . , βp
Least Squares Method

• Least Squares Criterion

min Σ(yi − ŷi)²

• Computation of Coefficient Values

The formulas for the regression coefficients b0, b1, b2, . . . , bp involve the use of matrix algebra. We will rely on computer software packages to perform the calculations.
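As a rough illustration of what those packages do, here is a minimal sketch in Python/NumPy that solves the least squares problem directly. The five observations are the first five rows of the programmer salary data shown later in the deck, used here only to keep the example short.

```python
# Minimal least squares sketch: solve min sum (yi - yhat_i)^2 with matrix algebra.
import numpy as np

# First five observations (experience, test score, salary) from the data table later in the deck.
X = np.array([[4, 78], [7, 100], [1, 86], [5, 82], [8, 86]], dtype=float)
y = np.array([24.0, 43.0, 23.7, 34.3, 35.8])

X_design = np.column_stack([np.ones(len(y)), X])   # prepend a column of 1s for b0
b, *_ = np.linalg.lstsq(X_design, y, rcond=None)   # minimizes the sum of squared residuals
print("b0, b1, b2 =", b)
```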
Example

• You are a real estate professional who wants to create a model to help predict the best time to sell homes. You'd like to sell homes at the maximum sales price, but multiple factors can affect the sales price. These variables include the age of the house, the value of other homes in the neighborhood, quantitative measurements of the public school system regarding student performance, and the number of nearby parks, among other factors.

• You can build a prediction model from these four independent variables to predict the maximum sales price of homes. If any of these factors change, you can adjust the model through their coefficient values.

Multiple Regression Model

• Example: Programmer Salary Survey

A software firm collected data for a sample of 20 computer programmers. A suggestion was made that regression analysis could be used to determine if salary was related to the years of experience and the score on the firm's programmer aptitude test.

The years of experience, the score on the aptitude test, and the corresponding annual salary ($1000s) for the sample of 20 programmers are shown on the next slide.
Multiple Regression Model

Exper. Score Salary Exper. Score Salary


4 78 24.0 9 88 38.0
7 100 43.0 2 73 26.6
1 86 23.7 10 75 36.2
5 82 34.3 5 81 31.6
8 86 35.8 6 74 29.0
10 84 38.0 8 87 34.0
0 75 22.2 4 79 30.1
1 80 23.1 6 94 33.9
6 83 30.0 3 70 28.2
6 91 33.0 3 89 30.0
Multiple Regression Model

Suppose we believe that salary (y) is related to the years of experience (x1) and the score on the programmer aptitude test (x2) by the following regression model:

y = β0 + β1x1 + β2x2 + ε

where:
y = annual salary ($1000s)
x1 = years of experience
x2 = score on programmer aptitude test
Solving for the Estimates of β0, β1, β2

Input Data: the x1, x2, and y values for the 20 programmers

Computer Package for Solving Multiple Regression Problems

Least Squares Output: b0, b1, b2, R², etc.
Estimated Regression Equation

SALARY = 3.174 + 1.404(EXPER) + 0.251(SCORE)

Note: Predicted salary will be in thousands of dollars
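As a check, the equation above can be reproduced (approximately) by fitting the 20-observation data set with any regression package. Below is a sketch using Python's statsmodels, one possible substitute for the computer package referenced in these slides.

```python
# Sketch: fit salary on experience and aptitude test score with statsmodels (assumed available).
import statsmodels.api as sm

exper  = [4, 7, 1, 5, 8, 10, 0, 1, 6, 6, 9, 2, 10, 5, 6, 8, 4, 6, 3, 3]
score  = [78, 100, 86, 82, 86, 84, 75, 80, 83, 91, 88, 73, 75, 81, 74, 87, 79, 94, 70, 89]
salary = [24.0, 43.0, 23.7, 34.3, 35.8, 38.0, 22.2, 23.1, 30.0, 33.0,
          38.0, 26.6, 36.2, 31.6, 29.0, 34.0, 30.1, 33.9, 28.2, 30.0]

X = sm.add_constant(list(zip(exper, score)))   # adds the intercept term b0
model = sm.OLS(salary, X).fit()
print(model.params)     # expected to be close to the slide's 3.174, 1.404, 0.251
print(model.rsquared)   # expected to be close to .834
```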


Interpreting the Coefficients

In multiple regression analysis, we interpret each regression coefficient as follows: bi represents an estimate of the change in y corresponding to a 1-unit increase in xi when all other independent variables are held constant.
Interpreting the Coefficients

b1 = 1.404

Salary is expected to increase by $1,404 for each additional year of experience (when the variable score on programmer aptitude test is held constant).
Interpreting the Coefficients

b2 = 0.251

Salary is expected to increase by $251 for each additional point scored on the programmer aptitude test (when the variable years of experience is held constant).
Multiple Coefficient of Determination

• Relationship Among SST, SSR, SSE

SST = SSR + SSE

Σ(yi − ȳ)² = Σ(ŷi − ȳ)² + Σ(yi − ŷi)²

where:
SST = total sum of squares
SSR = sum of squares due to regression
SSE = sum of squares due to error
Multiple Coefficient of Determination

R² = SSR/SST

R² = 500.3285/599.7855 = .83418
Adjusted Multiple Coefficient
of Determination

Ra² = 1 − (1 − R²) · (n − 1) / (n − p − 1)

Ra² = 1 − (1 − .834179) · (20 − 1) / (20 − 2 − 1) = .814671
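A short sketch of both calculations, using the sums of squares reported above (n = 20 observations, p = 2 independent variables):

```python
# Sketch: multiple coefficient of determination and its adjusted version.
SSR, SST = 500.3285, 599.7855   # sums of squares reported in the slides
n, p = 20, 2                    # observations and independent variables

R2 = SSR / SST                                   # about .8342
R2_adj = 1 - (1 - R2) * (n - 1) / (n - p - 1)    # about .8147
print(R2, R2_adj)
```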
Assumptions About the Error Term ε

• The error ε is a random variable with mean of zero.

• The variance of ε, denoted by σ², is the same for all values of the independent variables.

• The values of ε are independent.

• The error ε is a normally distributed random variable reflecting the deviation between the y value and the expected value of y given by β0 + β1x1 + β2x2 + . . . + βpxp.
Testing for Significance

• In simple linear regression, the F and t tests provide the same conclusion.

• In multiple regression, the F and t tests have different purposes.
Testing for Significance: F Test

• The F test is used to determine whether a significant relationship exists between the dependent variable and the set of all the independent variables.

• The F test is referred to as the test for overall significance.
Testing for Significance: t Test

• If the F test shows an overall significance, the t test is used to determine whether each of the individual independent variables is significant.

• A separate t test is conducted for each of the independent variables in the model.

• We refer to each of these t tests as a test for individual significance.
Testing for Significance: F Test

Hypotheses:      H0: β1 = β2 = . . . = βp = 0
                 Ha: One or more of the parameters is not equal to zero.

Test Statistic:  F = MSR/MSE

Rejection Rule:  Reject H0 if p-value < α or if F > Fα,
                 where Fα is based on an F distribution with p d.f. in the numerator
                 and n − p − 1 d.f. in the denominator.
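A sketch of the F test computation for the salary example (SSR and SST as reported earlier; scipy assumed for the F distribution):

```python
# Sketch: overall F test for the programmer salary model (n = 20, p = 2).
from scipy import stats

SSR = 500.3285
SSE = 599.7855 - SSR          # SST - SSR
n, p = 20, 2

MSR = SSR / p                 # mean square due to regression
MSE = SSE / (n - p - 1)       # mean square due to error
F = MSR / MSE                 # test statistic
p_value = stats.f.sf(F, p, n - p - 1)   # upper-tail p-value
print(F, p_value)             # reject H0 if p_value < alpha
```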
Testing for Significance: t Test

Hypotheses:      H0: βi = 0
                 Ha: βi ≠ 0

Test Statistic:  t = bi / sbi

Rejection Rule:  Reject H0 if p-value < α, or if t < −tα/2 or t > tα/2,
                 where tα/2 is based on a t distribution with n − p − 1 degrees of freedom.
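A sketch of a single t test follows; the standard error s_b1 below is an assumed placeholder, since the slides do not report the coefficient standard errors.

```python
# Sketch: t test for an individual coefficient (n = 20, p = 2).
from scipy import stats

b_1 = 1.404      # coefficient estimate from the slides
s_b1 = 0.20      # placeholder standard error (not reported in the slides)
n, p = 20, 2

t = b_1 / s_b1
t_crit = stats.t.ppf(1 - 0.05 / 2, n - p - 1)   # two-tailed critical value at alpha = .05
p_value = 2 * stats.t.sf(abs(t), n - p - 1)
print(t, t_crit, p_value)    # reject H0 if |t| > t_crit or p_value < alpha
```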
Testing for Significance: Multicollinearity

• The term multicollinearity refers to the correlation among the independent variables.

• When the independent variables are highly correlated (say, |r| > .7), it is not possible to determine the separate effect of any particular independent variable on the dependent variable.
Testing for Significance: Multicollinearity

• If the estimated regression equation is to be used only for predictive purposes, multicollinearity is usually not a serious problem.

• Every attempt should be made to avoid including independent variables that are highly correlated.
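One simple screening step is to look at the correlation between the independent variables before interpreting individual coefficients. A sketch using the experience and test-score data from the salary example:

```python
# Sketch: checking pairwise correlation between the two independent variables.
import numpy as np

exper = [4, 7, 1, 5, 8, 10, 0, 1, 6, 6, 9, 2, 10, 5, 6, 8, 4, 6, 3, 3]
score = [78, 100, 86, 82, 86, 84, 75, 80, 83, 91, 88, 73, 75, 81, 74, 87, 79, 94, 70, 89]

r = np.corrcoef(exper, score)[0, 1]
print(r)
if abs(r) > 0.7:   # rule of thumb from the slides
    print("High correlation: separate effects may be hard to determine.")
```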
Using the Estimated Regression Equation
for Estimation and Prediction

• The procedures for estimating the mean value of y and predicting an individual value of y in multiple regression are similar to those in simple regression.

• We substitute the given values of x1, x2, . . . , xp into the estimated regression equation and use the corresponding value of ŷ as the point estimate.
Using the Estimated Regression Equation
for Estimation and Prediction

• The formulas required to develop interval estimates for the mean value of y and for an individual value of y are beyond the scope of the textbook.

• Software packages for multiple regression will often provide these interval estimates.
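For example, statsmodels (assumed here; any comparable package would do) can produce both intervals from a fitted model. A sketch continuing the salary example, with an assumed new observation of 5 years of experience and a test score of 80:

```python
# Sketch: confidence interval for the mean of y and prediction interval for an individual y.
import statsmodels.api as sm

exper  = [4, 7, 1, 5, 8, 10, 0, 1, 6, 6, 9, 2, 10, 5, 6, 8, 4, 6, 3, 3]
score  = [78, 100, 86, 82, 86, 84, 75, 80, 83, 91, 88, 73, 75, 81, 74, 87, 79, 94, 70, 89]
salary = [24.0, 43.0, 23.7, 34.3, 35.8, 38.0, 22.2, 23.1, 30.0, 33.0,
          38.0, 26.6, 36.2, 31.6, 29.0, 34.0, 30.1, 33.9, 28.2, 30.0]

model = sm.OLS(salary, sm.add_constant(list(zip(exper, score)))).fit()

# Intervals at x1 = 5 years of experience, x2 = 80 (leading 1.0 is the intercept column).
pred = model.get_prediction([[1.0, 5.0, 80.0]])
print(pred.summary_frame(alpha=0.05))   # mean_ci_* = confidence interval, obs_ci_* = prediction interval
```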
Qualitative Independent Variables

• In many situations we must work with qualitative independent variables such as gender (male, female), method of payment (cash, check, credit card), etc.

• For example, x2 might represent gender, where x2 = 0 indicates male and x2 = 1 indicates female.

• In this case, x2 is called a dummy or indicator variable.
Qualitative Independent Variables

• Example: Programmer Salary Survey


As an extension of the problem involving the
computer programmer salary survey, suppose
that management also believes that the
annual salary is related to whether the
individual has a graduate degree in
computer science or information systems.
The years of experience, the score on the programmer
aptitude test, whether the individual has a relevant
graduate degree, and the annual salary ($1000) for each
of the sampled 20 programmers are shown on the next
slide.
Qualitative Independent Variables

Exper. Score Degr. Salary Exper. Score Degr. Salary


4 78 No 24.0 9 88 Yes 38.0
7 100 Yes 43.0 2 73 No 26.6
1 86 No 23.7 10 75 Yes 36.2
5 82 Yes 34.3 5 81 No 31.6
8 86 Yes 35.8 6 74 No 29.0
10 84 Yes 38.0 8 87 Yes 34.0
0 75 No 22.2 4 79 No 30.1
1 80 No 23.1 6 94 Yes 33.9
6 83 No 30.0 3 70 No 28.2
6 91 Yes 33.0 3 89 No 30.0
Estimated Regression Equation

ŷ = b0 + b1x1 + b2x2 + b3x3

where:
y = annual salary ($1000s)
x1 = years of experience
x2 = score on programmer aptitude test
x3 = 0 if individual does not have a graduate degree
     1 if individual does have a graduate degree

x3 is a dummy variable
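A sketch of the fit with the dummy variable added, using statsmodels (assumed) and coding the Degr. column as 1 = Yes, 0 = No:

```python
# Sketch: salary model with a graduate-degree dummy variable x3 (1 = Yes, 0 = No).
import statsmodels.api as sm

exper  = [4, 7, 1, 5, 8, 10, 0, 1, 6, 6, 9, 2, 10, 5, 6, 8, 4, 6, 3, 3]
score  = [78, 100, 86, 82, 86, 84, 75, 80, 83, 91, 88, 73, 75, 81, 74, 87, 79, 94, 70, 89]
degree = [0, 1, 0, 1, 1, 1, 0, 0, 0, 1, 1, 0, 1, 0, 0, 1, 0, 1, 0, 0]
salary = [24.0, 43.0, 23.7, 34.3, 35.8, 38.0, 22.2, 23.1, 30.0, 33.0,
          38.0, 26.6, 36.2, 31.6, 29.0, 34.0, 30.1, 33.9, 28.2, 30.0]

X = sm.add_constant(list(zip(exper, score, degree)))
print(sm.OLS(salary, X).fit().params)   # b0, b1, b2, b3
```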
More Complex Qualitative Variables

• If a qualitative variable has k levels, k − 1 dummy variables are required, with each dummy variable being coded as 0 or 1.

• For example, a variable with levels A, B, and C could be represented by x1 and x2 values of (0, 0) for A, (1, 0) for B, and (0, 1) for C.

• Care must be taken in defining and interpreting the dummy variables.
More Complex Qualitative Variables

• For example, a variable indicating level of education could be represented by x1 and x2 values as follows:

Highest
Degree x1 x2
Bachelor’s 0 0
Master’s 1 0
Ph.D. 0 1
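This k − 1 coding can be generated automatically; a sketch using pandas (assumed), where the baseline level Bachelor's becomes the (0, 0) case as in the table above:

```python
# Sketch: k - 1 = 2 dummy variables for a qualitative variable with 3 levels.
import pandas as pd

degrees = pd.Series(["Bachelor's", "Master's", "Ph.D.", "Master's", "Bachelor's"])
dummies = pd.get_dummies(degrees, drop_first=True, dtype=int)  # drops the baseline level
print(dummies)  # columns for Master's and Ph.D.; Bachelor's is the all-zero baseline
```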
Residual Analysis

• For simple linear regression, the residual plot against ŷ and the residual plot against x provide the same information.

• In multiple regression analysis, it is preferable to use the residual plot against ŷ to determine if the model assumptions are satisfied.
Standardized Residual Plot Against ŷ

• Standardized residuals are frequently used in residual plots for purposes of:
  • Identifying outliers (typically, standardized residuals < −2 or > +2)
  • Providing insight about the assumption that the error term ε has a normal distribution

• The computation of the standardized residuals in multiple regression analysis is too complex to be done by hand.

• Excel's Regression tool can be used.
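Outside Excel, most statistical packages compute these directly. A sketch with statsmodels (assumed), which reports internally studentized residuals; these may differ slightly from Excel's standard residuals but serve the same outlier-screening purpose:

```python
# Sketch: standardized (internally studentized) residuals for the salary model.
import statsmodels.api as sm

exper  = [4, 7, 1, 5, 8, 10, 0, 1, 6, 6, 9, 2, 10, 5, 6, 8, 4, 6, 3, 3]
score  = [78, 100, 86, 82, 86, 84, 75, 80, 83, 91, 88, 73, 75, 81, 74, 87, 79, 94, 70, 89]
salary = [24.0, 43.0, 23.7, 34.3, 35.8, 38.0, 22.2, 23.1, 30.0, 33.0,
          38.0, 26.6, 36.2, 31.6, 29.0, 34.0, 30.1, 33.9, 28.2, 30.0]

results = sm.OLS(salary, sm.add_constant(list(zip(exper, score)))).fit()
std_resid = results.get_influence().resid_studentized_internal
print([round(r, 2) for r in std_resid])   # values outside +/-2 flag potential outliers
```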
Standardized Residual Plot Against ŷ

• Excel Value Worksheet (Residual Output)

Observation   Predicted Y    Residuals      Standard Residuals
1             27.89626052    -3.89626052    -1.771706896
2             37.95204323     5.047956775    2.295406016
3             26.02901122    -2.32901122    -1.059047572
4             32.11201403     2.187985973    0.994920596
5             36.34250715    -0.54250715    -0.246688757

Note: Observations 6-20 are not shown.
Standardized Residual Plot Against ŷ

• Excel's Standardized Residual Plot

[Figure: standardized residual plot with predicted salary on the horizontal axis and standard residuals on the vertical axis; the point with a standardized residual above +2 is flagged as an outlier.]
b0 = 111.8
b1 = -7.18
b2 = 0.014

So, the estimated multiple regression model is:

Qd = 111.8 - 7.18x1 + 0.014x2