THT SKC Matlab

The document walks through two MATLAB Neural Network Toolbox examples for classifying data: a single-layer perceptron trained with trainp, and a two-layer feedforward network whose weights and biases are initialized with initff and then trained with backpropagation (trainbp) for 1,106 epochs. Training reduces the sum of squared errors until it falls below the error goal, improving the model's ability to classify the input data.
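Throughout, the reported SSE is the sum of squared errors: the squared differences between the targets and the network outputs, summed over all outputs and all training patterns. A minimal MATLAB sketch of that quantity (assuming t holds the targets and a the corresponding network outputs, one pattern per column):

% sum of squared errors over every output element and every pattern
sse = sum(sum((t - a).^2));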


SCRIPT:

nntwarn off
p=[1 1
4 2
2 1
1 4]
t=[0
1
0
1]
p=p'
t=t'
[w,b]=initp(p,t)
tp=[1 20]
[w,b]=trainp(w,b,p,t,tp)
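For readers unfamiliar with this early Neural Network Toolbox syntax, the calls can be read as follows; the meaning of the tp entries is my reading of how the old trainp was parameterized, not something stated in the original:

[w,b]=initp(p,t)          % initialize a single-output perceptron sized from p and t
tp=[1 20]                 % assumed meaning: [display frequency, maximum epochs]
[w,b]=trainp(w,b,p,t,tp)  % perceptron learning rule; stops at SSE = 0 or after 20 epochs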

COMMAND WINDOW:
p=
1 1
4 2
2 1
1 4

t=
0
1
0
1
p=
1 4 2 1
1 2 1 4
t=
0 1 0 1
w=
0.9003 -0.5377
b=
0.2137
tp =
1 20
TRAINP: 0/20 epochs, SSE = 3.
TRAINP: 1/20 epochs, SSE = 1.
TRAINP: 2/20 epochs, SSE = 2.
TRAINP: 3/20 epochs, SSE = 0.
w=
-0.0997 1.4623
b=
-1.7863
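With these final values the perceptron classifies all four training patterns correctly, consistent with the SSE = 0 line in the log above. A quick manual check, computing the net input w*p + b for each pattern and thresholding at zero (the usual hard-limit convention, assumed here):

% pattern [1;1]: -0.0997*1 + 1.4623*1 - 1.7863 = -0.4237  -> 0  (target 0)
% pattern [4;2]: -0.0997*4 + 1.4623*2 - 1.7863 =  0.7395  -> 1  (target 1)
% pattern [2;1]: -0.0997*2 + 1.4623*1 - 1.7863 = -0.5234  -> 0  (target 0)
% pattern [1;4]: -0.0997*1 + 1.4623*4 - 1.7863 =  3.9632  -> 1  (target 1)
a = (w*p + b) >= 0        % gives [0 1 0 1], matching t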

SCRIPT:

nntwarn off
p=[7000 7500
7500 8000
6500 7750
8500 7500
5500 9000
7500 7500]
t=[0 1
1 0
0 1
1 0
1 0
0 1]

p=p/9000
p=p'
t=t'
tp=[50 100000 0.1 0.1]
[w1,b1,w2,b2]=initff(p,10,'logsig',t,'logsig')
[w1,b1,w2,b2]=trainbp(w1,b1,'logsig',w2,b2,'logsig',p,t,tp)
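Here initff builds a two-layer feedforward network with 10 log-sigmoid hidden units and log-sigmoid outputs sized to t, and trainbp trains it with gradient-descent backpropagation. On my reading of this toolbox version, the four tp entries are [display frequency, maximum epochs, error goal, learning rate]: print every 50 epochs, stop after 100000 epochs or once SSE falls below 0.1, learning rate 0.1. An annotated restatement (comments are mine, not in the original):

p=p/9000                  % divide by the largest input (9000) so values lie in (0,1]
p=p'                      % toolbox convention: one training pattern per column
t=t'
tp=[50 100000 0.1 0.1]    % assumed: [disp_freq max_epoch err_goal learning_rate]
[w1,b1,w2,b2]=initff(p,10,'logsig',t,'logsig')               % 2-10-2 network
[w1,b1,w2,b2]=trainbp(w1,b1,'logsig',w2,b2,'logsig',p,t,tp)  % backprop training

In the command window below, tp is echoed in scientific notation (1.0e+005 * ...); the 0.1 entries round to 0.0000 at four decimal places under that scale.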
COMMAND WINDOW:
p=
7000 7500
7500 8000
6500 7750
8500 7500
5500 9000
7500 7500
t=
0 1
1 0
0 1
1 0
1 0
0 1

p=
0.7778 0.8333
0.8333 0.8889
0.7222 0.8611
0.9444 0.8333
0.6111 1.0000
0.8333 0.8333
p=
0.7778 0.8333 0.7222 0.9444 0.6111 0.8333
0.8333 0.8889 0.8611 0.8333 1.0000 0.8333
t=
0 1 0 1 1 0
1 0 1 0 0 1
tp =
1.0e+005 *
0.0005 1.0000 0.0000 0.0000
w1 =
51.4611 26.3936
-35.9897 78.1572
13.0446 102.9998
-3.1209 106.0690
40.9333 -67.7308
49.9895 -35.9692
-5.2845 105.7256
-40.1631 69.5506
51.1696 -28.5708
-7.3902 105.2195
b1 =
-72.0487
-46.2576
-99.0165
-103.4824
23.8548
-11.1727
-98.1402
-30.6787
-17.6429
-96.0369
w2 =
Columns 1 through 7
-3.2800 -0.3715 -0.2301 2.3428 -2.0121 2.2879 1.2266
1.8648 3.2629 -0.6147 0.1901 1.3007 -3.6298 -0.9107
Columns 8 through 10
2.2451 1.4174 -1.3221
0.0213 -0.5373 -2.3451
b2 =
-2.9921
1.7926
TRAINBP: 0/100000 epochs, SSE = 3.20353.
TRAINBP: 50/100000 epochs, SSE = 1.1422.
TRAINBP: 100/100000 epochs, SSE = 0.741457.
TRAINBP: 150/100000 epochs, SSE = 0.541136.
TRAINBP: 200/100000 epochs, SSE = 0.429311.
TRAINBP: 250/100000 epochs, SSE = 0.35846.
TRAINBP: 300/100000 epochs, SSE = 0.309142.
TRAINBP: 350/100000 epochs, SSE = 0.272521.
TRAINBP: 400/100000 epochs, SSE = 0.244068.
TRAINBP: 450/100000 epochs, SSE = 0.221218.
TRAINBP: 500/100000 epochs, SSE = 0.202403.
TRAINBP: 550/100000 epochs, SSE = 0.186606.
TRAINBP: 600/100000 epochs, SSE = 0.17313.
TRAINBP: 650/100000 epochs, SSE = 0.161486.
TRAINBP: 700/100000 epochs, SSE = 0.151314.
TRAINBP: 750/100000 epochs, SSE = 0.142346.
TRAINBP: 800/100000 epochs, SSE = 0.134377.
TRAINBP: 850/100000 epochs, SSE = 0.127246.
TRAINBP: 900/100000 epochs, SSE = 0.120826.
TRAINBP: 950/100000 epochs, SSE = 0.115014.
TRAINBP: 1000/100000 epochs, SSE = 0.109727.
TRAINBP: 1050/100000 epochs, SSE = 0.104897.
TRAINBP: 1100/100000 epochs, SSE = 0.100466.
TRAINBP: 1106/100000 epochs, SSE = 0.0999592.
w1 =
50.7531 25.7660
-35.9545 78.2004
13.0300 102.6139
-3.0427 106.1969
41.2373 -67.4506
48.8917 -37.0687
-5.2740 105.7419
-40.6828 68.8903
51.7742 -27.9042
-7.3807 105.2347
b1 =
-72.8004
-46.2072
-99.4720
-103.3545
24.2956
-12.5625
-98.1237
-31.4596
-16.9994
-96.0216
w2 =
Columns 1 through 7
-3.0844 -0.1500 3.4771 2.5062 -2.3038 4.3449 1.4602
1.7512 2.5800 -4.0096 -0.2998 1.8151 -4.7451 -1.6035
Columns 8 through 10
1.7710 3.0580 -1.0878
0.1324 -1.7261 -3.0401
b2 =
-3.1340
2.6273
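Training stopped at epoch 1106 because the SSE (0.0999592) fell below the 0.1 error goal given in tp. To reproduce that figure from the final weights without the toolbox's simulation routine, the two-layer forward pass can be computed directly; this is only a sketch, using the fact that logsig(n) = 1/(1 + exp(-n)):

Q = size(p,2);                              % number of training patterns (6)
a1 = 1./(1 + exp(-(w1*p + b1*ones(1,Q))));  % hidden layer: 10 logsig units
a2 = 1./(1 + exp(-(w2*a1 + b2*ones(1,Q)))); % output layer: 2 logsig units
sse = sum(sum((t - a2).^2))                 % should come out near 0.0999592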
