ML Lab 07 Manual - Linear Regression 2 (Updated Version 4)
Objectives
Lab Conduct
A major problem in training is that the learned weights may fit only the data
the model was given. The model then fails to generalize to examples outside the
dataset; this is referred to as "overfitting". Overfitting makes a machine
learning implementation impractical for real-life applications, where data has
high variation. To prevent overfitting, a modification to the cost function and
to gradient descent is implemented. This modification is called regularization
and is controlled by a hyperparameter, lambda.
cost_function(X, y, lambd)
Here m is the number of examples in the dataset and n is the total number of
features (i.e., non-bias weights) in the hypothesis. Write the code for the
cost function and run it on both your training and cross-validation datasets to
print out the cost. Provide the code and all relevant screenshots of the final
output.
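As a guide, the regularized cost function can be sketched as below. This sketch assumes the standard L2-regularized mean-squared-error form J = (1/2m)[Σ(h(x(i)) − y(i))² + λΣ w_j²]; note that the manual's signature cost_function(X, y, lambd) presumably reads the weights from enclosing scope, whereas here w and b are passed explicitly so the sketch is self-contained. The hypothesis h(x) = Xw + b and the names w, b are assumptions.

```python
import numpy as np

def cost_function(X, y, w, b, lambd):
    """Regularized MSE cost; a sketch, not the official lab solution."""
    m = X.shape[0]                    # number of training examples
    h = X @ w + b                     # hypothesis for all examples at once
    sq_err = np.sum((h - y) ** 2)     # unregularized squared error
    reg = lambd * np.sum(w ** 2)      # L2 penalty on non-bias weights only
    return (sq_err + reg) / (2 * m)
```

The same function is then called once with the training set and once with the cross-validation set to compare the two costs.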
db = \frac{\partial J}{\partial b} = \frac{1}{m} \sum_{i=1}^{m} \left( h(x^{(i)}) - y^{(i)} \right)

w_j := w_j - \alpha \frac{\partial J}{\partial w_j}

b := b - \alpha \frac{\partial J}{\partial b}
For the submission, run the gradient descent algorithm once to update the
weights. Print the weights, training cost, and validation cost both before and
after the weight update. Provide the code and all relevant screenshots of the
final output.
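A single update step following the equations above can be sketched as follows. The regularized weight gradient dw_j = (1/m) Σ (h(x(i)) − y(i)) x_j(i) + (λ/m) w_j is the standard L2 form and is an assumption here (the manual shows only the bias gradient explicitly); the bias gradient db matches the equation given and carries no regularization term. The names alpha, lambd, w, b follow the notation used so far.

```python
import numpy as np

def gradient_step(X, y, w, b, alpha, lambd):
    """One regularized gradient-descent update; a sketch under the
    assumptions stated above, not the official lab solution."""
    m = X.shape[0]
    h = X @ w + b                            # hypothesis
    err = h - y                              # residuals h(x(i)) - y(i)
    dw = (X.T @ err) / m + (lambd / m) * w   # weight gradient + L2 term
    db = np.sum(err) / m                     # bias gradient (unregularized)
    return w - alpha * dw, b - alpha * db
```

For the submission this step would be wrapped with print statements: print the weights and both costs, call gradient_step once, then print them again.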