Support Vector Machine (With Numerical Example) | by Balaji C | Medium
The main goal of SVM is to find the best-fit line or decision boundary that segregates n-dimensional space into classes, so that future data points can be placed in the correct category.
1. Linear SVM: Linear SVM is used for linearly separable data. If a dataset can be divided into two classes by a single straight line, the data is termed linearly separable, and the classifier used is called a Linear SVM classifier.
Solution: label every negative point with output -1 and every positive point with output +1.
[Graph of the points in the table above]
From the graph we can see one negative support vector s₁ = (1,0) and two positive support vectors s₂ = (3,1) and s₃ = (3,-1). Augmenting each with a bias entry of 1 gives s₁` = (1,0,1), s₂` = (3,1,1) and s₃` = (3,-1,1).
Generalized equations (· denotes the dot product):
α₁(s₁`·s₁`) + α₂(s₁`·s₂`) + α₃(s₁`·s₃`) = -1 → 1
α₁(s₂`·s₁`) + α₂(s₂`·s₂`) + α₃(s₂`·s₃`) = 1 → 2
α₁(s₃`·s₁`) + α₂(s₃`·s₂`) + α₃(s₃`·s₃`) = 1 → 3
Substituting the dot products s₁`·s₁` = 2, s₁`·s₂` = s₁`·s₃` = 4, s₂`·s₂` = s₃`·s₃` = 11 and s₂`·s₃` = 9 gives
2α₁ + 4α₂ + 4α₃ = -1
4α₁ + 11α₂ + 9α₃ = 1
4α₁ + 9α₂ + 11α₃ = 1
and solving this system yields α₁ = -3.5 and α₂ = α₃ = 0.75.
To find hyper-plane
W` = Σ αᵢ sᵢ`
W` = -3.5(1,0,1) + 0.75(3,1,1) + 0.75(3,-1,1) = (1,0,-2)
y = W·x + b, where the last entry of W` is the bias term:
W = (1,0) and b = -2
so the best-fit hyper-plane is the vertical line x₁ = 2, which splits the data points into two classes.
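The whole derivation can be checked numerically. A small sketch, assuming the augmented support vectors s₁` = (1,0,1), s₂` = (3,1,1), s₃` = (3,-1,1) read off the graph:

```python
import numpy as np

# augmented support vectors (bias entry 1 appended)
S = np.array([[1, 0, 1],
              [3, 1, 1],
              [3, -1, 1]], dtype=float)

# Gram matrix of dot products: K[i, j] = si` . sj`
K = S @ S.T

# right-hand side: -1 for the negative vector, +1 for the positives
t = np.array([-1.0, 1.0, 1.0])

# solve K @ alpha = t for the multipliers
alpha = np.linalg.solve(K, t)
print(alpha)   # -> approximately [-3.5, 0.75, 0.75]

# weight vector W` = sum_i alpha_i * si`
W = alpha @ S
print(W)       # -> approximately [1, 0, -2]
```

This reproduces W` = (1,0,-2), confirming the hand calculation above.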
import numpy as np
import matplotlib.pyplot as plt
from sklearn import svm, datasets

iris = datasets.load_iris()
X = iris.data[:, :2]  # use only the first two features so the regions can be plotted
y = iris.target

svc = svm.SVC(kernel='linear', C=1.0).fit(X, y)

# build a mesh over the feature space for plotting the decision regions
x_min, x_max = X[:, 0].min() - 1, X[:, 0].max() + 1
y_min, y_max = X[:, 1].min() - 1, X[:, 1].max() + 1
xx, yy = np.meshgrid(np.arange(x_min, x_max, 0.02),
                     np.arange(y_min, y_max, 0.02))

plt.subplot(1, 1, 1)
Z = svc.predict(np.c_[xx.ravel(), yy.ravel()])
Z = Z.reshape(xx.shape)
plt.contourf(xx, yy, Z, cmap=plt.cm.Paired, alpha=0.8)
plt.show()
2. Non-Linear SVM: In non-linear SVM we use a kernel to map low-dimensional data into a higher-dimensional space, where a separating hyper-plane can be found.
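As an illustration of this idea (not the article's own example, and using made-up data): points on two concentric circles are not linearly separable in 2-D, but appending the feature x₁² + x₂² lifts them into 3-D, where a plane separates them.

```python
import numpy as np

# illustrative data: class -1 on a circle of radius 1, class +1 on radius 3
angles = np.linspace(0, 2 * np.pi, 20, endpoint=False)
inner = np.c_[np.cos(angles), np.sin(angles)]          # radius 1
outer = np.c_[3 * np.cos(angles), 3 * np.sin(angles)]  # radius 3

# feature map phi(x1, x2) = (x1, x2, x1^2 + x2^2): lift to 3-D
def phi(X):
    return np.c_[X, (X ** 2).sum(axis=1)]

# in the lifted space the plane z = 5 separates the two classes:
# the inner ring has z ~ 1, the outer ring z ~ 9
z_inner = phi(inner)[:, 2]
z_outer = phi(outer)[:, 2]
print(z_inner.max() < 5 < z_outer.min())  # -> True
```

No straight line in the original plane can separate the rings, yet one extra feature makes the split trivial; that is the effect the kernel computes implicitly.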
Numerical example for Non-Linear SVM:
Solution: We have to find a hyper-plane that divides the data into two classes. But this is a non-linear problem, so we first map the data from one feature space to another using a kernel.
Apply the transformation θ(x₁, x₂) = (4 - x₂ + |x₁ - x₂|, 4 - x₁ + |x₁ - x₂|) whenever √(x₁² + x₂²) > 2; otherwise leave the point unchanged.
For (2,-2): √(2² + 2²) = √8 > 2, so θ(2,-2) = (4 + 2 + |2 + 2|, 4 - 2 + |2 + 2|) = (10, 6)
For (-2,-2): θ(-2,-2) = (4 + 2 + |-2 + 2|, 4 + 2 + |-2 + 2|) = (6, 6)
For (-2,2): θ(-2,2) = (4 - 2 + |-2 - 2|, 4 + 2 + |-2 - 2|) = (6, 10)
For (2,2): θ(2,2) = (4 - 2 + |2 - 2|, 4 - 2 + |2 - 2|) = (2, 2)
Hence the positive labelled data is remapped to (2,2), (10,6), (6,6), (6,10).
The negative labelled points are all of the form (±1, ±1); their distance from the origin is √2 < 2, so the transformation leaves them unchanged.
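The mapping above can be replayed in a few lines of NumPy; a small sketch, assuming the transformation and the eight labelled points described in this example:

```python
import numpy as np

def theta(x1, x2):
    # points farther than 2 from the origin are remapped; others stay put
    if np.hypot(x1, x2) > 2:
        return (4 - x2 + abs(x1 - x2), 4 - x1 + abs(x1 - x2))
    return (x1, x2)

positive = [(2, 2), (2, -2), (-2, -2), (-2, 2)]   # norm sqrt(8) > 2
negative = [(1, 1), (1, -1), (-1, -1), (-1, 1)]   # norm sqrt(2) < 2

print([theta(*p) for p in positive])  # -> [(2, 2), (10, 6), (6, 6), (6, 10)]
print([theta(*p) for p in negative])  # unchanged
```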
[Plot of the transformed data points]
From the graph the support vectors are S₁ = (1,1) and S₂ = (2,2); appending a bias entry of 1 gives S₁` = (1,1,1) and S₂` = (2,2,1).
α₁(S₁`·S₁`) + α₂(S₁`·S₂`) = -1 → 1
α₁(S₂`·S₁`) + α₂(S₂`·S₂`) = 1 → 2
Substituting S₁` = (1,1,1) and S₂` = (2,2,1):
3α₁ + 5α₂ = -1 → 1
5α₁ + 9α₂ = 1 → 2
Solving gives α₁ = -7 and α₂ = 4.
To find hyper-plane
W` = Σ αᵢ Sᵢ`
W` = -7 * (1,1,1) + 4 * (2,2,1)
W` = (1,1,-3)
y = W·x + b, where the last entry of W` is the bias term:
W = (1,1) and b = -3
On drawing the graph, the hyper-plane is the line x₁ + x₂ = 3.
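As with the linear example, this system is small enough to verify numerically; a minimal check using the augmented support vectors S₁` = (1,1,1) and S₂` = (2,2,1):

```python
import numpy as np

# augmented support vectors
S = np.array([[1, 1, 1],
              [2, 2, 1]], dtype=float)

# solve the 2x2 system  K @ alpha = t  with K[i, j] = Si` . Sj`
K = S @ S.T                # [[3, 5], [5, 9]]
t = np.array([-1.0, 1.0])
alpha = np.linalg.solve(K, t)
print(alpha)               # -> approximately [-7, 4]

# W` = sum_i alpha_i * Si`
W = alpha @ S
print(W)                   # -> approximately [1, 1, -3]
```

This matches W` = (1,1,-3) and hence the hyper-plane x₁ + x₂ = 3 derived above.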
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from matplotlib import style
from sklearn.svm import SVC

style.use('fivethirtyeight')

iris = pd.read_csv("iris-data.txt").values

fig, ax = plt.subplots()
X0, X1 = iris[:, 2], iris[:, 3]

# build the mesh directly (replacing the undefined make_meshgrid helper)
xx, yy = np.meshgrid(np.arange(X0.min() - 1, X0.max() + 1, 0.02),
                     np.arange(X1.min() - 1, X1.max() + 1, 0.02))

# fit an SVC on the two plotted feature columns and shade its decision regions
svc = SVC(kernel='rbf').fit(iris[:, 2:4], iris[:, 4].astype(int))
Z = svc.predict(np.c_[xx.ravel(), yy.ravel()]).reshape(xx.shape)
ax.contourf(xx, yy, Z, alpha=0.3)

color = ['red', 'green', 'blue']  # one colour per class label
for i in range(len(iris)):
    plt.scatter(iris[i][2], iris[i][3], s=30, c=color[int(iris[i][4])])
plt.show()
Advantages:
It works well when there is a clear margin of separation between classes.
It is effective in high-dimensional spaces and is memory efficient, since the decision function uses only a subset of the training points (the support vectors).
Disadvantages:
It doesn't perform well on large datasets, because the required training time is high.
It also doesn't perform well when the dataset is noisy, i.e. the target classes overlap.