Lecture 07 Multiple Dimension Input
Lecturer : Hongpu Liu Lecture 7-1 PyTorch Tutorial @ SLAM Research Group
Revision
Diabetes Dataset

Each row of the table is one sample; the columns X1–X8 are the input features and Y is the binary target.

 X1     X2     X3     X4     X5     X6     X7     X8     Y
-0.29   0.49   0.18  -0.29   0.00   0.00  -0.53  -0.03   0
-0.88  -0.15   0.08  -0.41   0.00  -0.21  -0.77  -0.67   1
-0.06   0.84   0.05   0.00   0.00  -0.31  -0.49  -0.63   0
-0.88  -0.11   0.08  -0.54  -0.78  -0.16  -0.92   0.00   1
 0.00   0.38  -0.34  -0.29  -0.60   0.28   0.89  -0.60   0
-0.41   0.17   0.21   0.00   0.00  -0.24  -0.89  -0.70   1
-0.65  -0.22  -0.18  -0.35  -0.79  -0.08  -0.85  -0.83   0
 0.18   0.16   0.00   0.00   0.00   0.05  -0.95  -0.73   1
-0.76   0.98   0.15  -0.09   0.28  -0.09  -0.93   0.07   0
-0.06   0.26   0.57   0.00   0.00   0.00  -0.87   0.10   0
Multiple Dimension Logistic Regression Model

With 8 input features per sample, the logistic regression model becomes

$$\hat{y}^{(i)} = \sigma\!\left(\sum_{n=1}^{8} x_n^{(i)}\,\omega_n + b\right) = \sigma\!\left(z^{(i)}\right)$$
Mini-Batch (N samples)

For each sample $i$:

$$z^{(i)} = \begin{bmatrix} x_1^{(i)} & \cdots & x_8^{(i)} \end{bmatrix} \begin{bmatrix} \omega_1 \\ \vdots \\ \omega_8 \end{bmatrix} + b, \qquad i = 1, \dots, N$$

Stacking all $N$ samples gives the matrix form:

$$\underbrace{\begin{bmatrix} z^{(1)} \\ \vdots \\ z^{(N)} \end{bmatrix}}_{N \times 1} = \underbrace{\begin{bmatrix} x_1^{(1)} & \cdots & x_8^{(1)} \\ \vdots & \ddots & \vdots \\ x_1^{(N)} & \cdots & x_8^{(N)} \end{bmatrix}}_{N \times 8} \underbrace{\begin{bmatrix} \omega_1 \\ \vdots \\ \omega_8 \end{bmatrix}}_{8 \times 1} + \underbrace{\begin{bmatrix} b \\ \vdots \\ b \end{bmatrix}}_{N \times 1}$$
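The vectorized form above can be sketched in NumPy. The shapes are the point here; the data is a random placeholder, not the diabetes dataset:

```python
import numpy as np

# Mini-batch linear step z = X·w + b in one matrix multiplication.
N = 10                                # batch size (number of samples)
rng = np.random.default_rng(0)
X = rng.standard_normal((N, 8))       # N×8 feature matrix
w = rng.standard_normal((8, 1))       # 8×1 weight column vector
b = 0.5                               # scalar bias, broadcast over all N rows

z = X @ w + b                         # N×1 result, one z^(i) per sample
print(z.shape)                        # (10, 1)
```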
Mini-Batch (N samples)

The sigmoid is applied elementwise, so the whole batch of predictions is computed at once:

$$\begin{bmatrix} \hat{y}^{(1)} \\ \vdots \\ \hat{y}^{(N)} \end{bmatrix} = \begin{bmatrix} \sigma(z^{(1)}) \\ \vdots \\ \sigma(z^{(N)}) \end{bmatrix} = \sigma\!\left( \begin{bmatrix} z^{(1)} \\ \vdots \\ z^{(N)} \end{bmatrix} \right)$$

class Model(torch.nn.Module):
    def __init__(self):
        super(Model, self).__init__()
        self.linear = torch.nn.Linear(8, 1)
        self.sigmoid = torch.nn.Sigmoid()

    def forward(self, x):
        x = self.sigmoid(self.linear(x))
        return x

model = Model()
Linear Layer

self.linear = torch.nn.Linear(8, 1)

This layer maps an N×8 input to an N×1 output; in the notation above, W has size 8×1 and b is broadcast over the N rows.
Linear Layer

self.linear = torch.nn.Linear(8, 2)

$$X = \begin{bmatrix} x_1^{(1)} & \cdots & x_8^{(1)} \\ \vdots & \ddots & \vdots \\ x_1^{(N)} & \cdots & x_8^{(N)} \end{bmatrix} \;\xrightarrow{\;O = \sigma(X \cdot W + b)\;}\; O = \begin{bmatrix} o_1^{(1)} & o_2^{(1)} \\ \vdots & \vdots \\ o_1^{(N)} & o_2^{(N)} \end{bmatrix}$$

with $W$ of size $8 \times 2$.
Linear Layer

self.linear = torch.nn.Linear(8, 6)

$$X = \begin{bmatrix} x_1^{(1)} & \cdots & x_8^{(1)} \\ \vdots & \ddots & \vdots \\ x_1^{(N)} & \cdots & x_8^{(N)} \end{bmatrix} \;\xrightarrow{\;O = \sigma(X \cdot W + b)\;}\; O = \begin{bmatrix} o_1^{(1)} & \cdots & o_6^{(1)} \\ \vdots & \ddots & \vdots \\ o_1^{(N)} & \cdots & o_6^{(N)} \end{bmatrix}$$

with $W$ of size $8 \times 6$.
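One caveat when comparing with the $O = \sigma(X \cdot W + b)$ notation: `torch.nn.Linear(8, 6)` stores its weight with shape (out_features, in_features), i.e. 6×8, and computes $x A^\top + b$, so the 8×6 $W$ above corresponds to the transpose of the stored tensor. A quick sketch:

```python
import torch

linear = torch.nn.Linear(8, 6)        # maps N×8 inputs to N×6 outputs
print(linear.weight.shape)            # torch.Size([6, 8]) -- stored as (out, in)
print(linear.bias.shape)              # torch.Size([6])

x = torch.randn(10, 8)                # a mini-batch of N = 10 samples
o = torch.sigmoid(linear(x))          # sigmoid applied elementwise, as in O = σ(X·W + b)
print(o.shape)                        # torch.Size([10, 6])
```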
Neural Network

Layers can be chained: the output of one linear-plus-activation stage becomes the input of the next.

$$O_1 = \sigma(X \cdot W_1 + b_1), \qquad O_2 = \sigma(O_1 \cdot W_2 + b_2), \qquad \dots$$
Example: Artificial Neural Network

$$X \in \mathbb{R}^{N \times 8} \;\to\; O_1 \in \mathbb{R}^{N \times 6} \;\to\; O_2 \in \mathbb{R}^{N \times 4} \;\to\; Y \in \mathbb{R}^{N \times 1}$$
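The dimension flow above can be checked directly by chaining three linear layers (the batch size N = 10 here is an arbitrary choice for illustration):

```python
import torch

# Dimension flow N×8 → N×6 → N×4 → N×1 through three linear layers.
l1, l2, l3 = torch.nn.Linear(8, 6), torch.nn.Linear(6, 4), torch.nn.Linear(4, 1)

x = torch.randn(10, 8)                # X  ∈ R^(N×8), with N = 10
o1 = torch.sigmoid(l1(x))             # O1 ∈ R^(N×6)
o2 = torch.sigmoid(l2(o1))            # O2 ∈ R^(N×4)
y = torch.sigmoid(l3(o2))             # Y  ∈ R^(N×1)
print(o1.shape, o2.shape, y.shape)
```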
Example: Diabetes Prediction
Example: 1. Prepare Dataset
import numpy as np
import torch

# The last column is the label y; the remaining eight columns are the features x.
xy = np.loadtxt('diabetes.csv.gz', delimiter=',', dtype=np.float32)
x_data = torch.from_numpy(xy[:, :-1])
y_data = torch.from_numpy(xy[:, [-1]])  # [-1] in brackets keeps y_data as an N×1 matrix
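The slicing used above is worth a closer look, since `[:, [-1]]` and `[:, -1]` produce different shapes. A sketch with a tiny in-memory CSV standing in for diabetes.csv.gz (same layout: features first, label last):

```python
import io
import numpy as np

# Two sample rows in the same 8-features-plus-label layout as the dataset.
csv = io.StringIO("-0.29,0.49,0.18,-0.29,0.00,0.00,-0.53,-0.03,0\n"
                  "-0.88,-0.15,0.08,-0.41,0.00,-0.21,-0.77,-0.67,1\n")
xy = np.loadtxt(csv, delimiter=',', dtype=np.float32)

x = xy[:, :-1]      # all columns except the last  -> shape (2, 8)
y = xy[:, [-1]]     # [-1] in a list keeps it 2-D  -> shape (2, 1)
y_flat = xy[:, -1]  # bare -1 flattens             -> shape (2,)
print(x.shape, y.shape, y_flat.shape)
```

The 2-D `(N, 1)` form is what the model and BCELoss expect, which is why the bracketed index is used.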
Example: 2. Define Model

model = Model()

The model maps the N×8 input to predictions $Y \in \mathbb{R}^{N \times 1}$.
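The Model class itself is not reproduced on this slide; a sketch consistent with the N×8 → N×6 → N×4 → N×1 architecture shown earlier, using Sigmoid activations throughout, would be:

```python
import torch

class Model(torch.nn.Module):
    def __init__(self):
        super(Model, self).__init__()
        self.linear1 = torch.nn.Linear(8, 6)   # N×8 -> N×6
        self.linear2 = torch.nn.Linear(6, 4)   # N×6 -> N×4
        self.linear3 = torch.nn.Linear(4, 1)   # N×4 -> N×1
        self.sigmoid = torch.nn.Sigmoid()      # one module, reused after each layer

    def forward(self, x):
        x = self.sigmoid(self.linear1(x))
        x = self.sigmoid(self.linear2(x))
        x = self.sigmoid(self.linear3(x))
        return x

model = Model()
```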
Example: 3. Construct Loss and Optimizer

criterion = torch.nn.BCELoss(reduction='mean')  # size_average is deprecated; reduction='mean' is the equivalent
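Only the loss is shown on this slide; the matching optimizer construction would look like the sketch below. The learning rate 0.1 and the stand-in model are illustrative choices, not from the slide:

```python
import torch

# A minimal stand-in model so the optimizer has parameters to manage.
model = torch.nn.Sequential(torch.nn.Linear(8, 1), torch.nn.Sigmoid())

criterion = torch.nn.BCELoss(reduction='mean')           # mean of the per-sample BCE terms
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)  # lr=0.1 is an illustrative choice
```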
Example: 4. Training Cycle
# Update
optimizer.step()
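The update step above sits inside the standard forward / backward / update cycle; a minimal self-contained sketch, with random placeholder data in place of the diabetes dataset:

```python
import torch

torch.manual_seed(0)
x_data = torch.randn(100, 8)                 # placeholder features, N = 100
y_data = (torch.rand(100, 1) > 0.5).float()  # placeholder 0/1 labels

model = torch.nn.Sequential(torch.nn.Linear(8, 1), torch.nn.Sigmoid())
criterion = torch.nn.BCELoss(reduction='mean')
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

for epoch in range(100):
    # Forward
    y_pred = model(x_data)
    loss = criterion(y_pred, y_data)
    # Backward
    optimizer.zero_grad()
    loss.backward()
    # Update
    optimizer.step()
```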
Exercise: Try different activation functions

http://rasbt.github.io/mlxtend/user_guide/general_concepts/activation-functions/#activation-functions-for-artificial-neural-networks
https://dashee87.github.io/data%20science/deep%20learning/visualising-activation-functions-in-neural-networks/
https://pytorch.org/docs/stable/nn.html#non-linear-activations-weighted-sum-nonlinearity
Exercise: Try different activation functions

import torch

class Model(torch.nn.Module):
    def __init__(self):
        super(Model, self).__init__()
        self.linear1 = torch.nn.Linear(8, 6)
        self.linear2 = torch.nn.Linear(6, 4)
        self.linear3 = torch.nn.Linear(4, 1)
        self.activate = torch.nn.ReLU()

    def forward(self, x):
        x = self.activate(self.linear1(x))
        x = self.activate(self.linear2(x))
        x = torch.sigmoid(self.linear3(x))  # keep sigmoid on the output so BCELoss still applies
        return x

model = Model()