20. Hyperparameter_Tuning
Hyperparameter tuning is a key step when building a machine learning model. As you know, machine learning models are prone to overfitting, and tuning the hyperparameters is one way to control overfitting and improve generalization.
If we train a linear regression with SGD, the parameters of the model are the slope and the bias, while the learning rate is a hyperparameter.
Some examples of model hyperparameters include the learning rate for training a neural network, C and the kernel parameters of a support vector machine, the value of k in k-nearest neighbours, and the maximum depth of a decision tree.
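To make the distinction concrete, here is a minimal sketch (an illustrative example, not from the original text) using scikit-learn's SGDRegressor: the learning rate eta0 is something we choose before training, while the slope and bias are learned from the data.
import numpy as np
from sklearn.linear_model import SGDRegressor

# Toy data: y = 3x + 2 (illustrative assumption)
X = np.arange(10, dtype=float).reshape(-1, 1)
y = 3.0 * X.ravel() + 2.0

# eta0 (the learning rate) is a hyperparameter we set before training
model = SGDRegressor(learning_rate='constant', eta0=0.01, max_iter=1000)
model.fit(X, y)

# coef_ (slope) and intercept_ (bias) are parameters learned during training
print("slope:", model.coef_, "bias:", model.intercept_)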
Each approach below ends with the term CV (cross-validation) because cross-validation is performed as part of the hyperparameter search.
1. GridSearchCV
In the GridSearchCV approach, the machine learning model is evaluated over a range of hyperparameter values. The approach is called GridSearchCV because it searches for the best set of hyperparameters from a grid of hyperparameter values.
In grid search, we try every combination of a preset list of hyperparameter values and evaluate the model for each combination.
Implementation
It includes 4 steps:
1. Importing the necessary libraries
2. Instantiating the ML model
3. Defining the grid of hyperparameter values and instantiating GridSearchCV
4. Fitting the search and reading off the best hyperparameters
# Necessary imports
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV
import numpy as np
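Putting the steps together, here is a minimal runnable sketch; the Iris dataset and the particular grid values are illustrative assumptions, not part of the original text.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV

X, y = load_iris(return_X_y=True)

# Instantiating the ML model
logreg = LogisticRegression(max_iter=1000)

# Grid of hyperparameter values to try (every combination is evaluated)
param_grid = {
    'C': [0.01, 0.1, 1, 10, 100],  # inverse regularization strength
    'penalty': ['l2'],
}

# 5-fold cross-validation for every combination in the grid
grid_search = GridSearchCV(estimator=logreg, param_grid=param_grid,
                           cv=5, scoring='accuracy')
grid_search.fit(X, y)

print("Best hyperparameters:", grid_search.best_params_)
print("Best cross-validated accuracy:", grid_search.best_score_)
After fitting, best_params_ holds the winning combination and best_score_ its mean cross-validated score.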
Drawbacks of GridSearchCV
GridSearchCV goes through all possible combinations of the hyperparameter values in the grid, which makes grid search computationally very expensive.
Next, let us discuss RandomizedSearchCV, which is faster than GridSearchCV.
2. RandomizedSearchCV
RandomizedSearchCV addresses the main drawback of GridSearchCV, as it evaluates only a fixed number of hyperparameter settings. It moves through the grid in a random fashion to find the best set of hyperparameters. In other words, random search tries random combinations of the hyperparameters to find the best configuration for the model.
Implementation
It includes 4 steps:
1. Importing the necessary libraries
2. Instantiating the ML model
3. Defining the distributions or lists of hyperparameter values and instantiating RandomizedSearchCV
4. Fitting the search and reading off the best hyperparameters
# Necessary imports
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import RandomizedSearchCV
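As with grid search, here is a minimal runnable sketch; the Iris dataset, the parameter ranges, and n_iter=20 are illustrative assumptions, not part of the original text.
from scipy.stats import randint
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import RandomizedSearchCV

X, y = load_iris(return_X_y=True)

# Instantiating the ML model
tree = DecisionTreeClassifier(random_state=42)

# Distributions/lists to sample hyperparameter values from
param_distributions = {
    'max_depth': [2, 3, 4, 5, 6, None],
    'min_samples_split': randint(2, 20),  # sampled uniformly from [2, 20)
    'criterion': ['gini', 'entropy'],
}

# Only n_iter random settings are evaluated, each with 5-fold cross-validation
random_search = RandomizedSearchCV(estimator=tree,
                                   param_distributions=param_distributions,
                                   n_iter=20, cv=5, scoring='accuracy',
                                   random_state=42)
random_search.fit(X, y)

print("Best hyperparameters:", random_search.best_params_)
print("Best cross-validated accuracy:", random_search.best_score_)
Because only n_iter settings are tried instead of every combination, this search finishes much faster than an exhaustive grid search over the same space.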