
Perceptron Algorithm

The document describes a single-layer perceptron neural network with two inputs (X1, X2) and one output (Y). It shows the network architecture, the calculation of the network output from the weights (W1, W2) and threshold (θ), and worked examples of training the network on the logical AND and OR functions by iteratively adjusting the weights to reduce the error between the actual and desired outputs. A single-layer perceptron can learn these linearly separable functions; XOR, which appears in the truth table for comparison, is not linearly separable and cannot be learned by this network.

[Figure: ANN, single-layer two-input perceptron (Rosenblatt, 1958). Inputs X1 and X2, weighted by W1 and W2, feed a neuron that forms their linear combination, subtracts the threshold θ, and passes the result through a hard limiter to produce the output Y.]
Y(p) = step[ Σ_{i=1}^{n} Xi(p) · Wi(p) − θ ]

Where:
n    = the number of perceptron inputs
step = the step (hard-limit) activation function: step(x) = 1 if x ≥ 0, otherwise 0

The error at iteration p is the difference between the desired and the actual output,

e(p) = Yd(p) − Y(p)

and each weight is adjusted with the perceptron learning rule

Wi(p+1) = Wi(p) + α · Xi(p) · e(p)

Where:
e = the error obtained
α = the learning rate (the worked tables below are consistent with α = 0.1 and threshold θ = 0.2)
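Combining the output rule and the weight-update rule gives the full training loop. The following Python sketch is not part of the original document; the parameter values θ = 0.2, α = 0.1, and initial weights W1 = 0.3, W2 = −0.1 are assumptions read off the worked tables, and the training set is the AND function:

```python
# Sketch of the perceptron training loop (Rosenblatt's learning rule).
# theta = 0.2, alpha = 0.1, and the initial weights are assumptions
# taken from the worked tables in this document.

def step(x):
    """Hard-limit activation: 1 if x >= 0, otherwise 0."""
    return 1 if x >= 0 else 0

def train(samples, w, theta=0.2, alpha=0.1, max_epochs=100):
    """Repeat epochs until one passes with no error; return final weights."""
    for _ in range(max_epochs):
        total_error = 0
        for inputs, yd in samples:
            y = step(sum(x * wi for x, wi in zip(inputs, w)) - theta)
            e = yd - y                      # e(p) = Yd(p) - Y(p)
            total_error += abs(e)
            # Wi(p+1) = Wi(p) + alpha * Xi(p) * e(p); rounding keeps the
            # weights on the same 0.1 grid as the hand-worked tables.
            w = [round(wi + alpha * x * e, 10) for wi, x in zip(w, inputs)]
        if total_error == 0:
            break
    return w

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
weights = train(AND, [0.3, -0.1])
print(weights)  # -> [0.1, 0.1]
```

Run as-is, this reproduces the epoch-by-epoch AND tables shown below and converges to W1 = W2 = 0.1.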
Input          AND        OR         Exclusive OR
X1  X2         X1 ∧ X2    X1 ∨ X2    X1 ⊕ X2
0   0          0          0          0
0   1          0          1          1
1   0          0          1          1
1   1          1          1          0
Input     Desired   Initial      Actual   Error   Final
          Output    Weights      Output           Weights
X1  X2    Yd        W1    W2     Y        e       W1    W2
0   0               0.3  -0.1
0   1
1   0
1   1
Y(p) = step[ Σ_{i=1}^{n} Xi(p) · Wi(p) − θ ]
Training on the AND function:

Epoch 1:
Input     Desired   Initial      Actual   Error   Final
          Output    Weights      Output           Weights
X1  X2    Yd        W1    W2     Y        e       W1    W2
0   0     0         0.3  -0.1    0         0      0.3  -0.1
0   1     0         0.3  -0.1    0         0      0.3  -0.1
1   0     0         0.3  -0.1    1        -1      0.2  -0.1
1   1     1         0.2  -0.1    0         1      0.3   0.0

Epoch 2:
Input     Desired   Initial      Actual   Error   Final
          Output    Weights      Output           Weights
X1  X2    Yd        W1    W2     Y        e       W1    W2
0   0     0         0.3   0.0    0         0      0.3   0.0
0   1     0         0.3   0.0    0         0      0.3   0.0
1   0     0         0.3   0.0    1        -1      0.2   0.0
1   1     1         0.2   0.0    1         0      0.2   0.0

Epoch 3:
Input     Desired   Initial      Actual   Error   Final
          Output    Weights      Output           Weights
X1  X2    Yd        W1    W2     Y        e       W1    W2
0   0     0         0.2   0.0    0         0      0.2   0.0
0   1     0         0.2   0.0    0         0      0.2   0.0
1   0     0         0.2   0.0    1        -1      0.1   0.0
1   1     1         0.1   0.0    0         1      0.2   0.1

Epoch 4:
Input     Desired   Initial      Actual   Error   Final
          Output    Weights      Output           Weights
X1  X2    Yd        W1    W2     Y        e       W1    W2
0   0     0         0.2   0.1    0         0      0.2   0.1
0   1     0         0.2   0.1    0         0      0.2   0.1
1   0     0         0.2   0.1    1        -1      0.1   0.1
1   1     1         0.1   0.1    1         0      0.1   0.1
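The AND training above ends with W1 = 0.1, W2 = 0.1. A quick Python check (assuming the threshold θ = 0.2 implied by the tables) confirms that these final weights reproduce the AND column of the truth table:

```python
# Verify that the learned weights reproduce AND (theta = 0.2 is an
# assumption read off the worked tables, not stated in the document).
w1, w2, theta = 0.1, 0.1, 0.2

for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    y = 1 if x1 * w1 + x2 * w2 - theta >= 0 else 0
    print(x1, x2, y)  # y = 1 only for the input (1, 1)
```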
Training on the OR function, epoch 1:

Input     Desired   Initial      Actual   Error   Final
          Output    Weights      Output           Weights
X1  X2    Yd        W1    W2     Y        e       W1    W2
0   0     0         0.3  -0.1    0         0      0.3  -0.1
0   1     1         0.3  -0.1    0         1      0.3   0.0
1   0     1         0.3   0.0    1         0      0.3   0.0
1   1     1         0.3   0.0    1         0      0.3   0.0
Input     Desired   Initial      Actual   Error   Final
          Output    Weights      Output           Weights
X1  X2    Yd        W1    W2     Y        e       W1    W2
0   0     0         0.1   0.3
0   1     1
1   0     1
1   1     1
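The last table is left blank as an exercise. A sketch of how it could be completed in Python, starting from the given initial weights W1 = 0.1, W2 = 0.3 and assuming the same θ = 0.2 and α = 0.1 as in the earlier tables:

```python
# Sketch: perceptron learning rule on OR, from the exercise's initial
# weights W1 = 0.1, W2 = 0.3 (theta = 0.2 and alpha = 0.1 are
# assumptions carried over from the earlier worked tables).

def step(x):
    return 1 if x >= 0 else 0

OR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w, theta, alpha = [0.1, 0.3], 0.2, 0.1

for epoch in range(100):
    errors = 0
    for (x1, x2), yd in OR:
        y = step(x1 * w[0] + x2 * w[1] - theta)
        e = yd - y
        errors += abs(e)
        # Rounding keeps the weights on the same 0.1 grid as the tables.
        w = [round(w[0] + alpha * x1 * e, 10),
             round(w[1] + alpha * x2 * e, 10)]
    if errors == 0:      # a full pass with no error: training is done
        break

print(w)  # -> [0.2, 0.3]
```

Only the (1, 0) pattern is misclassified at the start, so a single correction of W1 (0.1 to 0.2) suffices, and the next epoch passes with no error.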
