Support Vector Machine: R Programming Exercise 2

This document is a hands-on Support Vector Machine (SVM) exercise in R. It first builds a small synthetic two-class dataset and fits an SVM with a radial kernel, then applies a linear-kernel SVM to the four-class Khan gene-expression data from the 'ISLR' package. Model fitting uses the 'e1071' library, and confusion tables are reported for both the training and test sets.

Uploaded by

hubertkuo418

Support Vector Machine: R Programming Exercise 2

411210002 郭玉皓

2024-12-24
library(e1071)

## Warning: package 'e1071' was built under R version 4.4.2

set.seed(1)
x = matrix(rnorm(50*2), ncol=2)   # 50 observations, 2 features
y = c(rep(0, 25), rep(1, 25))     # two classes of 25 observations each
x[y==0, 2] = x[y==0, 2]+2         # shift class 0 along the second feature
dat = data.frame(x=x, y=as.factor(y))
par(mfrow=c(1, 1))
plot(x, col=(y+1))                # colour points by class

svmfit = svm(y~., data=dat, kernel="radial", cost=10, gamma=1)  # radial-kernel SVM


plot(svmfit, dat)  # decision boundary; support vectors are marked
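As a quick sanity check (not part of the original handout), the in-sample fit of the radial-kernel SVM can be tabulated. This sketch regenerates the synthetic data so it runs on its own; the name `train_err` is purely illustrative:

```r
library(e1071)

# Recreate the synthetic two-class data from above
set.seed(1)
x <- matrix(rnorm(50*2), ncol=2)
y <- c(rep(0, 25), rep(1, 25))
x[y==0, 2] <- x[y==0, 2] + 2
dat <- data.frame(x=x, y=as.factor(y))

# Radial-kernel SVM with the same hyperparameters as the exercise
svmfit <- svm(y~., data=dat, kernel="radial", cost=10, gamma=1)

# In-sample confusion table and training error rate
print(table(fitted = svmfit$fitted, truth = dat$y))
train_err <- mean(svmfit$fitted != dat$y)
cat("training error rate:", train_err, "\n")
```

A low training error only says the boundary fits this sample; with cost=10 and gamma=1 the fit can be quite flexible, so held-out data (as in the Khan example below) is the real check.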
library(ISLR)

names(Khan)

## [1] "xtrain" "xtest" "ytrain" "ytest"

dim(Khan$xtrain)

## [1] 63 2308

dim(Khan$xtest)

## [1] 20 2308

length(Khan$ytrain)

## [1] 63

length(Khan$ytest)

## [1] 20

table(Khan$ytrain)

##
## 1 2 3 4
## 8 23 12 20

table(Khan$ytest)

##
## 1 2 3 4
## 3 6 6 5

dat = data.frame(x=Khan$xtrain, y=as.factor(Khan$ytrain))  # training set: 63 observations, 2308 genes


out = svm(y~., data=dat, kernel="linear", cost=10)  # with far more features than observations, a linear kernel is sufficient
summary(out)

##
## Call:
## svm(formula = y ~ ., data = dat, kernel = "linear", cost = 10)
##
##
## Parameters:
## SVM-Type: C-classification
## SVM-Kernel: linear
## cost: 10
##
## Number of Support Vectors: 58
##
## ( 20 20 11 7 )
##
##
## Number of Classes: 4
##
## Levels:
## 1 2 3 4

table(out$fitted, dat$y)

##
## 1 2 3 4
## 1 8 0 0 0
## 2 0 23 0 0
## 3 0 0 12 0
## 4 0 0 0 20

dat.te = data.frame(x=Khan$xtest, y=as.factor(Khan$ytest))


pred.te = predict(out, newdata=dat.te)  # predict classes for the 20 test observations
table(pred.te, dat.te$y)

##
## pred.te 1 2 3 4
## 1 3 0 0 0
## 2 0 6 2 0
## 3 0 0 4 0
## 4 0 0 0 5
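The confusion table above shows two misclassified test observations (two class-3 cases predicted as class 2). A small follow-up sketch, assuming the 'ISLR' and 'e1071' packages are installed, computes the test error rate directly; `test_err` is an illustrative name:

```r
library(e1071)
library(ISLR)

# Refit the linear-kernel SVM on the Khan training data
dat <- data.frame(x=Khan$xtrain, y=as.factor(Khan$ytrain))
out <- svm(y~., data=dat, kernel="linear", cost=10)

# Predict the 20 test observations and compute the error rate
dat.te <- data.frame(x=Khan$xtest, y=as.factor(Khan$ytest))
pred.te <- predict(out, newdata=dat.te)
test_err <- mean(pred.te != dat.te$y)
cat("test error rate:", test_err, "\n")  # 2/20 = 0.1 per the table above
```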
