VISVESVARAYA TECHNOLOGICAL UNIVERSITY
“JNANA SANGAMA”, BELAGAVI - 590018
K.R PET KRISHNA GOVERNMENT ENGINEERING COLLEGE, K.R PET -571426
Department of Computer Science & Engineering
“MULTILAYER PERCEPTRON”
Presented by
ARATHI S 4GK21CS005
Under the Supervision of
Course Coordinator: Dr Devika G, Assistant Professor
Head of the Department: Dr Hareesh K, Department of CSE
CONTENTS:
• INTRODUCTION
• ALGORITHM
• EXAMPLE PROBLEM
• APPLICATIONS
• ADVANTAGES
• DISADVANTAGES
LINEAR REGRESSION:
• Linear regression is a widely used statistical technique for modeling the
relationship between a dependent variable y and one or more independent
variables x1, x2, ..., xn. It assumes this relationship is linear and aims to
find the best-fitting straight line (or hyperplane in higher dimensions) that
minimizes the difference between the observed values of the dependent
variable and the values predicted by the model.
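For reference, with a single independent variable the model is y = a x + b, and
the least-squares estimates of the coefficients (the formulas applied in the
example below) are:
a = (n Σxy - Σx Σy) / (n Σx² - (Σx)²)
b = (1/n)(Σy - a Σx)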
ALGORITHM STEPS:
• Initialization: Start with initial estimates of the coefficients a and b.
• Predicted Values: Compute the predicted values of the dependent variable y using the linear
model.
• Residuals: Calculate the residuals (errors) by subtracting the predicted values from the observed
values.
• Cost Function: Compute the cost function (e.g., sum of squared residuals) to measure the
goodness of fit.
• Gradient Descent (Optional): If using gradient descent optimization, update the coefficients
iteratively to minimize the cost function.
• Convergence Check: Repeat steps 2-5 until the convergence criteria are met (e.g., a small
change in the cost function or a maximum number of iterations reached); these steps are
sketched in the code below.
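The steps above can be illustrated with a short Python sketch (a minimal
illustration, assuming NumPy; the function name, learning rate, and tolerance
are our own illustrative choices, not from the slides):

import numpy as np

def fit_line(x, y, learning_rate=0.01, n_iters=10000, tol=1e-9):
    a, b = 0.0, 0.0                      # Initialization: start from zero coefficients
    prev_cost = float("inf")
    for _ in range(n_iters):
        y_pred = a * x + b               # Predicted values from the linear model
        residuals = y - y_pred           # Residuals: observed minus predicted
        cost = np.mean(residuals ** 2)   # Cost function: mean squared residuals
        # Gradient descent: move the coefficients downhill along the gradient
        a += learning_rate * 2 * np.mean(residuals * x)
        b += learning_rate * 2 * np.mean(residuals)
        if abs(prev_cost - cost) < tol:  # Convergence check on the change in cost
            break
        prev_cost = cost
    return a, b

x = np.array([-2.0, 1.0, 3.0])
y = np.array([-1.0, 1.0, 2.0])
print(fit_line(x, y))                    # approx. (0.605, 0.263), i.e. 23/38 and 5/19

On the example points used in the next slide, the sketch converges to the same
line as the closed-form solution.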
EXAMPLE:
• Consider the following set of points: {(-2, -1), (1, 1), (3, 2)}
a) Find the least square regression line for the given data points.
b) Plot the given points and the regression line in the same rectangular
system of axes.
• a) Let us organize the data in a table.
   x       y      xy      x²
  -2      -1       2       4
   1       1       1       1
   3       2       6       9
Σx = 2  Σy = 2  Σxy = 9  Σx² = 14
We now use the above formulas to calculate a and b as follows:
a = (n Σxy - Σx Σy) / (n Σx² - (Σx)²) = (3*9 - 2*2) / (3*14 - 2²) = 23/38
b = (1/n)(Σy - a Σx) = (1/3)(2 - (23/38)*2) = 5/19
b) We now graph the regression line y = (23/38) x + 5/19 together with the
given points.
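As a quick numerical check of this result (a minimal sketch, assuming NumPy;
np.polyfit with degree 1 returns the least-squares slope and intercept):

import numpy as np

x = np.array([-2.0, 1.0, 3.0])
y = np.array([-1.0, 1.0, 2.0])
a, b = np.polyfit(x, y, 1)   # least-squares line y = a*x + b
print(a, 23 / 38)            # both ≈ 0.60526
print(b, 5 / 19)             # both ≈ 0.26316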
APPLICATIONS:
• Predictive modeling (e.g., predicting house prices, stock prices).
• Forecasting (e.g., sales forecasting, demand forecasting).
• Trend analysis (e.g., analyzing the relationship between variables over
time).
• Risk assessment (e.g., predicting the likelihood of events based on risk
factors).
ADVANTAGES:
• Interpretability: Linear regression provides easily interpretable
coefficients that represent the relationship between independent and
dependent variables.
• Simple and Efficient: Linear regression is computationally efficient and
easy to implement.
• Few Distributional Assumptions: Linear regression makes no assumption about
the distribution of the independent variables (classical inference does,
however, assume normally distributed errors).
• Baseline Model: Linear regression serves as a baseline model for more
complex machine learning algorithms.
DISADVANTAGES:
• Linear Assumption: Linear regression assumes a linear relationship between
independent and dependent variables, which may not always hold true.
• Sensitive to Outliers: Linear regression is sensitive to outliers, which can
significantly affect the estimated coefficients.
• Multicollinearity: If independent variables are highly correlated
(multicollinearity), linear regression may produce unstable and unreliable
estimates of coefficients.
• Underfitting: Linear regression may underfit complex relationships in the data if
the true relationship is nonlinear.