
9. Linear Regression: Appendix
[Link]@[Link]
2025-03-18
Which of the functions is linear?
What's roughly the slope?
What's roughly the offset / bias?

 X     Y1      Y2      Y3
 0     2.8     3.1     2
 1     5.4     4.5     8.1
 2     8.3     7.1     14.1
 3     11.4    12.4    20
 4     14.9    19.5    26
 5     17.9    28.5    32.1
 6     20.2    40      38
 7     23.6    52.6    44.1
 8     26.8    68      50
 9     29.1    85      56
10     32.9    104     62.1
11     35.1    124.4   68
12     38.1    147.3   74
13     41.9    172.1   80
14     44.8    199.8   86
15     47.4    228.3   92
16     50.1    259.7   98
17     53.5    292.4   104.1
18     56.2    327.6   110
19     59.7    364.9   116.1

y1 = x * 3 + 2
y3 = x * 6 + 2
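To sanity-check the slope and offset, a minimal least-squares sketch, assuming NumPy; the data are copied from the Y1 and Y3 columns above:

import numpy as np

x = np.arange(20)
y1 = np.array([2.8, 5.4, 8.3, 11.4, 14.9, 17.9, 20.2, 23.6, 26.8, 29.1,
               32.9, 35.1, 38.1, 41.9, 44.8, 47.4, 50.1, 53.5, 56.2, 59.7])
y3 = np.array([2, 8.1, 14.1, 20, 26, 32.1, 38, 44.1, 50, 56,
               62.1, 68, 74, 80, 86, 92, 98, 104.1, 110, 116.1])

# A degree-1 polynomial fit returns (slope, offset)
print(np.polyfit(x, y1, 1))  # roughly (3, 2): y1 is linear with noise
print(np.polyfit(x, y3, 1))  # roughly (6, 2): y3 = 6x + 2 almost exactly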
Likelihood

“... how well a statistical model explains observed data by calculating the probability of seeing that data under different parameter values of the model.”

From: [Link]
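To connect this to regression: under the standard assumption of Gaussian noise, maximizing the likelihood is the same as minimizing the squared error. A sketch of that derivation (a standard result, with w the weights and sigma^2 the assumed noise variance):

% Model: y_i = w^T x_i + eps_i, with eps_i ~ N(0, sigma^2)
L(w) = \prod_{i=1}^{n} \frac{1}{\sqrt{2\pi\sigma^2}} \exp\!\left(-\frac{(y_i - w^\top x_i)^2}{2\sigma^2}\right)
% Log-likelihood, dropping terms that do not depend on w:
\log L(w) = -\frac{1}{2\sigma^2} \sum_{i=1}^{n} (y_i - w^\top x_i)^2 + \text{const}
% Hence the MLE equals the least-squares solution:
\hat{w} = \arg\max_w \log L(w) = \arg\min_w \sum_{i=1}^{n} (y_i - w^\top x_i)^2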
Regression with Feature Transformation

From: [Link]

Regression with Feature Engineering

Fitting: y = w0 * log(x) + w1

[Plot: the fit shown against log(x)]
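A minimal sketch of this fit, assuming NumPy; the data below are hypothetical, since the slide only shows a plot:

import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0.5, 10.0, size=100)  # hypothetical inputs; x > 0 so log(x) is defined
y = 2.0 * np.log(x) + 1.0 + rng.normal(scale=0.1, size=100)  # assumed truth: w0=2, w1=1

# Feature engineering: regress y on log(x) instead of x; the model stays linear in w
w0, w1 = np.polyfit(np.log(x), y, 1)
print(w0, w1)  # recovers roughly (2.0, 1.0)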
Regression with Random Feature Mapping / Random Kitchen Sinks

This is the trick: the random mapping.
This is just linear regression (+ regularization) on the randomly mapped features.

Example target: y = (X1 > 0) ^ (X2 > 0), i.e. an XOR of the input signs, which is not linear in X1 and X2.
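A minimal sketch of the idea, assuming NumPy; the cosine features and their scale are one common choice (random Fourier features, in the spirit of random kitchen sinks), not necessarily the slide's exact mapping:

import numpy as np

rng = np.random.default_rng(0)

# XOR-style target: not linear in (X1, X2)
X = rng.uniform(-1, 1, size=(500, 2))
y = ((X[:, 0] > 0) ^ (X[:, 1] > 0)).astype(float)

# Random kitchen sinks: project onto D random directions, apply a cosine
D = 300
W = rng.normal(scale=2.0, size=(2, D))  # random weights; scale 2.0 is an assumed bandwidth
b = rng.uniform(0, 2 * np.pi, size=D)   # random phases
Z = np.cos(X @ W + b)                   # randomly mapped features

# Plain ridge regression (linear regression + L2 regularization) on the mapped features
lam = 1e-3
w = np.linalg.solve(Z.T @ Z + lam * np.eye(D), Z.T @ y)

pred = Z @ w > 0.5
print("train accuracy:", (pred == (y > 0.5)).mean())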
Overview
● Intuition behind linear functions
● What error to fit?
● Maximum Likelihood Estimation (MLE)
● Minimizing the negative log-likelihood
● Multi-label Regression
● Feature Engineering
○ Log, reciprocal
○ Random Basis Functions
● What to do with outliers?
○ Least Absolute Error (see the sketch after this list)
● Collinearity & Sparsity
○ Effect of 𝛌
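For the outlier item above, a minimal sketch comparing least squares against least absolute error, assuming NumPy and SciPy; the data and the outlier are hypothetical:

import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
x = np.arange(20, dtype=float)
y = 3 * x + 2 + rng.normal(scale=0.5, size=20)
y[5] += 40  # one gross outlier

A = np.column_stack([x, np.ones_like(x)])  # design matrix: slope and offset columns

# Least squares has a closed form but gets pulled toward the outlier
w_ls, *_ = np.linalg.lstsq(A, y, rcond=None)

# Least absolute error: minimize sum |residual| numerically (non-smooth, so Nelder-Mead)
w_lae = minimize(lambda w: np.abs(A @ w - y).sum(), x0=w_ls, method="Nelder-Mead").x

print("least squares  (slope, offset):", w_ls)   # biased by the outlier
print("least absolute (slope, offset):", w_lae)  # close to (3, 2)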
