
Linear Regression using Least Squares Method

Ajmal Mohammed

Department of ECE

September 23, 2025

What is Regression?

Regression analysis is used to study the relationship between two variables.

We assume a linear relation:

y = a + bx + ε

Where:
a = intercept
b = slope
ε = error term
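
As a quick illustration, the sketch below simulates data from this model (assuming NumPy is available; the values of a, b, and the noise scale are illustrative, not taken from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)

a_true, b_true = 1.0, 2.0                 # illustrative intercept and slope
x = np.linspace(0, 10, 50)                # explanatory variable
eps = rng.normal(0, 1.0, size=x.size)     # error term ε
y = a_true + b_true * x + eps             # observations follow y = a + bx + ε
```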

Least Squares Objective

Given n observations (xi, yi), the predicted value is:

ŷi = a + bxi

Error (residual) is:

ei = yi − ŷi = yi − (a + bxi)

The Least Squares Method minimizes the sum of squared errors:


S = Σᵢ₌₁ⁿ (yi − a − bxi)²
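
As a sketch (NumPy assumed), S can be evaluated directly for any candidate pair (a, b); the data below are taken from the worked example later in the slides:

```python
import numpy as np

def sse(a, b, x, y):
    """Sum of squared residuals S = Σ (yi - a - b*xi)²."""
    residuals = y - (a + b * x)
    return float(np.sum(residuals ** 2))

x = np.array([1, 2, 3, 4, 5], dtype=float)
y = np.array([2, 4, 5, 4, 5], dtype=float)
print(sse(2.2, 0.6, x, y))   # S at the least-squares fit derived later
```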

Derivation of Normal Equations
Step 1: Define error sum of squares
S = Σᵢ₌₁ⁿ (yi − a − bxi)²

Step 2: Differentiate with respect to a

∂S/∂a = −2 Σ (yi − a − bxi) = 0
⇒ Σ yi = na + b Σ xi

Step 3: Differentiate with respect to b

∂S/∂b = −2 Σ xi (yi − a − bxi) = 0
⇒ Σ xi yi = a Σ xi + b Σ xi²

Normal Equations

The two equations obtained are:


Σ yi = na + b Σ xi
Σ xi yi = a Σ xi + b Σ xi²

Solve simultaneously to find a and b.


Regression line:
ŷ = a + bx
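
Viewed as a 2×2 linear system, the normal equations can be assembled from the data sums and solved directly; a minimal sketch assuming NumPy:

```python
import numpy as np

def fit_normal_equations(x, y):
    """Solve [n, Σx; Σx, Σx²] · [a, b]ᵀ = [Σy, Σxy]ᵀ for the coefficients."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    Sx, Sy = x.sum(), y.sum()
    Sxx, Sxy = (x * x).sum(), (x * y).sum()
    A = np.array([[n, Sx], [Sx, Sxx]])
    rhs = np.array([Sy, Sxy])
    a, b = np.linalg.solve(A, rhs)
    return a, b
```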

Direct Formulas for a and b

Using the means x̄ and ȳ:

b = Σ (xi − x̄)(yi − ȳ) / Σ (xi − x̄)²
a = ȳ − b x̄

Regression equation:
ŷ = a + bx
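
The same fit via the mean-based formulas, again as a sketch assuming NumPy:

```python
import numpy as np

def fit_direct(x, y):
    """b = Σ(xi − x̄)(yi − ȳ) / Σ(xi − x̄)²,  a = ȳ − b·x̄."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    x_bar, y_bar = x.mean(), y.mean()
    b = np.sum((x - x_bar) * (y - y_bar)) / np.sum((x - x_bar) ** 2)
    a = y_bar - b * x_bar
    return a, b
```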

Example Problem
Data:

x:  1  2  3  4  5
y:  2  4  5  4  5

Σ yi = na + b Σ xi
Σ xi yi = a Σ xi + b Σ xi²

Step 1: Compute necessary sums

Σ x = 15,  Σ y = 20,  Σ x² = 55,  Σ xy = 66

Step 2: Normal equations
20 = 5a + 15b
66 = 15a + 55b
Step 3: Solve for a, b
a = 2.2, b = 0.6
Regression line: ŷ = 2.2 + 0.6x
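
A quick numerical check of this example (NumPy assumed; either sketch above gives the same coefficients):

```python
import numpy as np

x = np.array([1, 2, 3, 4, 5], dtype=float)
y = np.array([2, 4, 5, 4, 5], dtype=float)

# Sums used in the normal equations: 15, 20, 55, 66
print(x.sum(), y.sum(), (x * x).sum(), (x * y).sum())

# Solve 20 = 5a + 15b and 66 = 15a + 55b
A = np.array([[5.0, 15.0], [15.0, 55.0]])
a, b = np.linalg.solve(A, np.array([20.0, 66.0]))
print(a, b)   # 2.2, 0.6  ->  ŷ = 2.2 + 0.6x
```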
Interpretation of Coefficients

a (intercept): expected value of y when x = 0.


b (slope): expected change in y for a one-unit increase in x.
Least squares guarantees the minimum sum of squared errors between the actual and predicted values.

Summary

Least Squares minimizes:


Σ (yi − (a + bxi))²

Leads to the normal equations, which are solved for a and b.


Direct formulas exist using the means x̄ and ȳ.
The method provides the best-fit line for the given data.

