Different Types of Regression Analysis – A Basic Guide

 
Ajay Sarangam
Content Writer
 10 Jun 2022


INTRODUCTION
The term regression is used to indicate the estimation or prediction of the average value
of one variable for a specified value of another variable. Regression analysis is a
statistical tool used to estimate the relationship between a dependent variable and one or
more independent variables. For example, if the manager of a firm wants to quantify the
relationship between advertisement expenditure and sales for future planning, the
regression technique will be the most suitable tool.
There are several types of regression analysis; let's discuss them in more detail:

1. Linear Regression
Linear regression is a type of model in which the relationship between an independent
variable and a dependent variable is assumed to be linear. The estimate of the variable y is
obtained from the equation y' - y_bar = b_yx(x - x_bar) ... (1), and the estimate of the
variable x is obtained from the equation x' - x_bar = b_xy(y - y_bar) ... (2). The graphical
representations of equations (1) and (2) are known as regression lines. These lines
are fitted using the Method of Least Squares.
There are two kinds of Linear Regression Model:-
 Simple Linear Regression: A linear regression model with one independent and one
dependent variable.
 Multiple Linear Regression: A linear regression model with more than one
independent variable and one dependent variable.
Assumptions of Linear Regression

 Adequate sample size: a common rule of thumb is at least 20 cases per independent
variable.
 Heteroscedasticity is absent (the error variance is constant).
 Linear relationships exist between the variables.
 Sample observations are independent.
 No multicollinearity or autocorrelation.
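The regression line of y on x from equation (1) can be computed directly from the data. A minimal sketch with NumPy, using invented advertisement-expenditure/sales figures for the manager example from the introduction:

```python
import numpy as np

# Toy data: advertisement expenditure (x) vs. sales (y).
# All figures are made up purely for illustration.
x = np.array([10.0, 20.0, 30.0, 40.0, 50.0])
y = np.array([25.0, 44.0, 67.0, 85.0, 104.0])

# Regression coefficient b_yx = cov(x, y) / var(x), so the fitted line
# satisfies y' - y_bar = b_yx * (x - x_bar), i.e. equation (1) above.
x_bar, y_bar = x.mean(), y.mean()
b_yx = ((x - x_bar) * (y - y_bar)).sum() / ((x - x_bar) ** 2).sum()

def predict(x_new):
    """Estimate y for a given x using the fitted regression line."""
    return y_bar + b_yx * (x_new - x_bar)

print(b_yx)           # slope of the regression line of y on x
print(predict(35.0))  # predicted sales at expenditure 35
```

This is exactly what the method of least squares produces for a single predictor.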
2. Polynomial Regression
It is a type of regression analysis that models the relationship between the independent
variable x and the dependent variable y as non-linear (a polynomial). It is considered a
special case of Multiple Linear Regression even though it fits a non-linear curve to the
data, because the model is still linear in its coefficients: the data may be correlated even
when the relationship between the two variables does not look linear.
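A quick sketch of the idea, assuming NumPy is available; the quadratic data below is invented for illustration:

```python
import numpy as np

# Quadratic toy data: y depends on x non-linearly, yet the model is
# still linear in its coefficients c2, c1, c0.
rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 50)
y = 2.0 * x**2 - 1.0 * x + 0.5 + rng.normal(0, 0.1, x.size)

# Fit y = c2*x^2 + c1*x + c0 by least squares.
c2, c1, c0 = np.polyfit(x, y, deg=2)
```

Because the fit is least squares on the powers of x, it is literally a multiple linear regression on the columns (x^2, x, 1).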

3. Logistic Regression
Logistic Regression is a method that was first used in the field of biology in the 20th
century. It is used to estimate the probability of one of two mutually exclusive outcomes,
for example happy/sad, normal/abnormal, or pass/fail. The predicted probability always
lies strictly between 0 and 1.
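A minimal pass/fail sketch, assuming scikit-learn is available; the hours-studied data is invented for illustration:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy pass/fail example: hours studied -> probability of passing.
hours = np.array([[1.0], [2.0], [3.0], [4.0], [5.0], [6.0], [7.0], [8.0]])
passed = np.array([0, 0, 0, 0, 1, 1, 1, 1])

model = LogisticRegression().fit(hours, passed)

# Predicted probability of passing after 4.5 hours of study;
# it is guaranteed to lie strictly between 0 and 1.
proba = model.predict_proba([[4.5]])[0, 1]
```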

4. Quantile Regression
Quantile Regression is an econometric technique used when the necessary conditions for
Linear Regression are not fully met. It is an extension of Linear Regression analysis: we
can use it when outliers are present in the data, as its estimates are more robust against
outliers than those of linear regression.

5. Ridge Regression
To understand Ridge Regression we first need to get through the concept of
Regularization. 

Regularization: There are two types of regularization, L1 regularization and L2
regularization. L1 regularization adds a penalty equal to the sum of the absolute values of
the coefficients, which restricts the size of the coefficients and can shrink some of them
to exactly zero, effectively removing them. L2 regularization, which Ridge Regression
uses, adds a penalty equal to the sum of the squares of the coefficients, shrinking them
without eliminating them.

In this way, regularization addresses overfitting: the scenario where a model performs
well on training data but underperforms on validation data.
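A sketch of the shrinkage effect, assuming scikit-learn; the nearly collinear data is invented to make plain least squares unstable:

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge

# Two nearly collinear features make OLS coefficients unstable;
# the L2 penalty of ridge regression shrinks them.
rng = np.random.default_rng(1)
x1 = rng.normal(size=100)
x2 = x1 + rng.normal(scale=0.01, size=100)  # almost a copy of x1
X = np.column_stack([x1, x2])
y = 3.0 * x1 + rng.normal(scale=0.1, size=100)

ols = LinearRegression().fit(X, y)
ridge = Ridge(alpha=1.0).fit(X, y)  # alpha controls the L2 penalty strength

# Ridge coefficients are smaller in magnitude, yet their sum still
# reflects the true combined effect (~3).
```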

6. Lasso Regression 
LASSO (Least Absolute Shrinkage and Selection Operator) is a regression technique that
was first introduced in geophysics. The term "Lasso" was coined by Professor Robert
Tibshirani. Just like Ridge Regression, it uses regularization to estimate the results, but
with the L1 penalty; as a result it also performs variable selection, which makes the model
more efficient.
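The variable-selection behaviour can be seen directly: irrelevant coefficients are driven exactly to zero. A sketch assuming scikit-learn, with invented data in which only two of five features matter:

```python
import numpy as np
from sklearn.linear_model import Lasso

# Only features 0 and 1 actually influence y; the L1 penalty should
# zero out the other three coefficients (variable selection).
rng = np.random.default_rng(2)
X = rng.normal(size=(200, 5))
y = 4.0 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(scale=0.5, size=200)

lasso = Lasso(alpha=0.5).fit(X, y)
n_selected = int(np.sum(lasso.coef_ != 0))  # features kept by the lasso
```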

7. Elastic Net Regression 


Elastic net regression is favoured over ridge and lasso regression when one has to deal
with highly correlated independent variables, since it combines the L1 and L2 penalties.
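A sketch with scikit-learn's `ElasticNet`, on invented data with two highly correlated predictors; unlike the lasso, which may arbitrarily drop one of a correlated pair, the elastic net tends to share the coefficient between them:

```python
import numpy as np
from sklearn.linear_model import ElasticNet

# x1 and x2 are almost identical; the true combined effect on y is 4.
rng = np.random.default_rng(3)
x1 = rng.normal(size=200)
x2 = x1 + rng.normal(scale=0.05, size=200)
X = np.column_stack([x1, x2])
y = 2.0 * x1 + 2.0 * x2 + rng.normal(scale=0.1, size=200)

# l1_ratio mixes the penalties: 0 is pure ridge, 1 is pure lasso.
enet = ElasticNet(alpha=0.1, l1_ratio=0.5).fit(X, y)
```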

8. Principal Components Regression (PCR)


Principal components regression is a technique that is broadly used when one has many
independent variables. It is used to estimate the unknown regression coefficients of a
standard linear regression model. The technique consists of two steps:

1. Obtain the principal components of the independent variables.

2. Run the regression analysis on those principal components.
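The two steps above can be chained in one pipeline. A sketch assuming scikit-learn, on invented data where ten noisy indicators are driven by two latent factors:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

# Ten noisy copies of two latent factors z1, z2; the leading principal
# components recover the factors, and y is regressed on them.
rng = np.random.default_rng(4)
z1 = rng.normal(size=150)
z2 = rng.normal(size=150)
X = np.column_stack(
    [z1 + rng.normal(scale=0.1, size=150) for _ in range(5)]
    + [z2 + rng.normal(scale=0.1, size=150) for _ in range(5)]
)
y = 3.0 * z1 - 2.0 * z2 + rng.normal(scale=0.2, size=150)

# Step 1 (PCA) and step 2 (regression on the components) in a pipeline.
pcr = make_pipeline(PCA(n_components=2), LinearRegression()).fit(X, y)
r2 = pcr.score(X, y)  # in-sample R^2 of the PCR fit
```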

9. Partial Least Squares Regression (PLS)


It is an alternative to principal components regression when one has highly correlated
independent variables. The technique is also helpful when one has many independent
variables. Partial least squares regression is widely used in the chemical, drug, food,
and plastics industries.

10. Support Vector Regression


Support vector regression can be used to fit both linear and non-linear models, and it has
proven to be an effective tool for real-valued function estimation.
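A non-linear example, assuming scikit-learn; a sine curve (invented data) fitted with an RBF-kernel SVR:

```python
import numpy as np
from sklearn.svm import SVR

# Noisy sine data: a relationship no straight line can capture.
rng = np.random.default_rng(6)
X = np.sort(rng.uniform(0, 2 * np.pi, 200)).reshape(-1, 1)
y = np.sin(X).ravel() + rng.normal(scale=0.05, size=200)

# RBF kernel handles the non-linearity; epsilon sets the tube width
# within which errors are ignored. Values chosen for illustration.
svr = SVR(kernel="rbf", C=10.0, epsilon=0.05).fit(X, y)
r2 = svr.score(X, y)
```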
11. Ordinal Regression 
Ordinal regression is used to predict ranked values. The technique is useful when the
dependent variable is ordinal. Two examples of ordinal regression are the ordered logit
and ordered probit models.

12. Poisson Regression 


Poisson Regression is used to predict counts, for example the number of customer-care
calls received about a particular product. It is used when the dependent variable is a
count. Poisson regression is also known as the log-linear model when it is used to
model contingency tables. Its dependent variable y is assumed to follow a Poisson
distribution.
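A sketch with scikit-learn's `PoissonRegressor` on invented count data generated from the log-linear model log E[y] = 0.5 + x:

```python
import numpy as np
from sklearn.linear_model import PoissonRegressor

# Simulated counts whose mean follows exp(0.5 + 1.0 * x).
rng = np.random.default_rng(8)
x = rng.uniform(0, 2, 500).reshape(-1, 1)
y = rng.poisson(np.exp(0.5 + 1.0 * x.ravel()))

# alpha=0 turns off regularization so we recover the raw coefficients.
pois = PoissonRegressor(alpha=0.0).fit(x, y)
```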

13. Negative Binomial Regression


Similar to Poisson regression, negative binomial regression also deals with count data;
the only difference is that negative binomial regression does not assume that the variance
of the counts equals their mean, so it can handle overdispersed data.

14. Quasi Poisson Regression


Quasi-Poisson regression is an alternative to negative binomial regression. The technique
can be used for overdispersed count data.

15. Cox Regression 


Cox Regression is useful for modelling time-to-event data. It estimates the effect of the
explanatory variables on the time until a specific event occurs. Cox Regression is also
known as proportional hazards regression.

16. Tobit Regression


Tobit Regression is used to evaluate linear relationships between variables when
censoring exists in the dependent variable, i.e. when the dependent variable is observed
only within a certain range. All censored observations are reported as a single limit
value.

APPLICATION OF REGRESSION ANALYSIS


 Forecasting: Different types of regression analysis can be used to forecast future
opportunities and threats for a business. For instance, a customer’s likely purchase
volume can be predicted using a demand analysis.
However, when it comes to business, demand isn’t the only variable that affects
profitability. 
 Comparison with competition: A company’s financial performance can be
compared to that of a specific competitor using this tool. Also, it can be used to
determine the correlation between the stock prices of two different companies within
the same industry or different industries. 
When compared to a rival company, regression can help identify which factors are
influencing a firm's sales, and it can help small businesses achieve success quickly.
 Problem Identification: In addition to providing factual evidence, a regression can
be used to identify and correct judgment errors. For example, a retail shop owner may
believe that extending the hours of operation will result in a significant increase in
sales. However, regression analysis shows that the monetary gains as a result of
increasing the working hours are not enough to offset the increase in operational costs
that comes along with it.
Regression analysis may provide the business owners with quantitative support for their
decisions and prevent them from making mistakes because of their intuition.
 Decision Making: Regression analysis (along with other types of statistical analysis) is
now used by many businesses and their top executives to make better business decisions
and reduce guesswork and intuition.
Scientific management is made possible by regression. Data overload is a problem for
both small and large organizations. To make the best decisions possible, managers can
use regression analysis to sort through data and select relevant factors.
CONCLUSION
The types of regression analysis are listed above, but choosing the correct regression
model is a tough grind. It requires broad knowledge of statistical tools and their
applications. The correct method is chosen based on the nature of the variables, the data,
and the model itself. Overall, the different types of regression analysis make it easy to
work with discrete and continuous data, and in recent times they have found applications
not only in mathematics and statistics but in many real-world settings as well. Hence,
regression analysis is a boon for mankind.
