
Autoencoder example: 20-8-3-8-20

Let us take an example with:


Input = output = the 20×20 identity matrix, i.e. twenty one-hot patterns (pattern i has a 1 in position i and 0 everywhere else):

1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1
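In NumPy this training set is simply the identity matrix. The following sanity check is illustrative and not part of the original document:

```python
import numpy as np

# The training set: 20 one-hot patterns, one per row.
X = np.eye(20)

# Each pattern has exactly one active unit.
print(X.shape)       # (20, 20)
print(X[0][:5])      # first pattern starts with 1, then zeros
```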

This example is solved by dividing the network into three parts, trained in sequence (greedy layer-wise training):

(1) 20-8-20: input-to-hidden weights w1, hidden layer h1, and the same 20-dimensional target as above. The sum of squared errors for this stage is shown below.

(2) 8-3-20: input h1, input-to-hidden weights w2, hidden layer h2, and the same target as above. The sum of squared errors for this stage is shown below.

(3) 3-8-20: input h2, input-to-hidden weights w3, hidden layer h3, and the same target as above. The sum of squared errors for this stage is shown below.
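The three training stages can be sketched in NumPy. This is a minimal illustration, not the exact procedure that produced the matrices below: it assumes sigmoid activations, no bias terms, and full-batch gradient descent on the sum of squared errors, and the names w1/h1 etc. follow the text above.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_stage(X, T, n_hidden, epochs=10000, lr=0.5):
    """Train one stage X -> hidden -> T by gradient descent on
    the sum of squared errors. Returns the learned weights and
    the hidden activations, which feed the next stage."""
    W_in = rng.normal(0, 0.5, (X.shape[1], n_hidden))
    W_out = rng.normal(0, 0.5, (n_hidden, T.shape[1]))
    for _ in range(epochs):
        H = sigmoid(X @ W_in)            # hidden layer
        Y = sigmoid(H @ W_out)           # reconstruction
        dY = (Y - T) * Y * (1 - Y)       # output delta (sigmoid)
        dH = (dY @ W_out.T) * H * (1 - H)
        W_out -= lr * H.T @ dY
        W_in -= lr * X.T @ dH
    return W_in, W_out, sigmoid(X @ W_in)

X = np.eye(20)                           # input = output = I20

w1, _, h1 = train_stage(X, X, 8)         # stage 1: 20-8-20
w2, _, h2 = train_stage(h1, X, 3)        # stage 2: 8-3-20
w3, w_out, h3 = train_stage(h2, X, 8)    # stage 3: 3-8-20

Y = sigmoid(h3 @ w_out)                  # final 20x20 reconstruction
print(Y.shape)
```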

Output matrix without applying the winner-takes-all principle (row i is the network's reconstruction of input pattern i; note that the diagonal entry dominates every row):

0.81026 1.77E-10 0.02471 2.20E-07 1.42E-07 0.00117 7.81E-08 3.69E-09 7.37E-15 0.0001 1.21E-11 1.21E-07 5.32E-10 0.00019 0.00014 0.00086 0.00381 0.10856 8.91E-13 4.81E-13
2.05E-08 0.882378 0.001181 4.25E-05 1.74E-07 4.58E-07 2.33E-05 0.008321 0.000396 5.57E-05 0.000168 0.019322 0.000733 1.25E-06 3.46E-08 3.44E-06 5.55E-07 2.51E-10 0.006968 0.077925
0.038985 0.008613 0.912024 7.60E-05 5.17E-05 0.001903 0.002765 0.001666 6.26E-07 0.00348 0.000284 0.001301 3.55E-06 0.001447 0.002521 0.002731 0.004843 0.000282 8.99E-05 7.91E-06
2.31E-10 7.07E-06 5.33E-10 0.92833 0.01423 9.35E-12 2.05E-10 1.08E-10 3.79E-10 3.06E-13 7.97E-07 0.01502 0.01571 0.00941 3.70E-08 1.38E-12 3.59E-11 1.06E-09 1.34E-09 1.08E-05
2.76E-05 5.49E-09 1.42E-06 0.026161 0.911042 1.83E-07 1.28E-05 1.59E-07 1.43E-07 2.20E-09 0.028444 7.54E-07 7.09E-07 0.022988 0.016072 8.26E-08 6.62E-06 0.000464 1.11E-05 7.28E-09
0.00989 1.08E-07 0.0007 4.29E-08 4.58E-08 0.93249 0.01774 4.26E-10 1.34E-08 4.83E-06 8.85E-06 3.26E-09 8.03E-09 1.50E-05 3.85E-06 0.00011 0.04833 0.00231 2.39E-12 6.62E-07
1.72E-06 3.47E-07 4.39E-05 2.24E-09 3.87E-07 0.018492 0.96582 9.11E-10 9.40E-05 3.17E-07 0.014455 8.93E-11 1.82E-09 5.91E-05 0.003987 1.90E-05 0.001425 6.41E-06 3.26E-07 1.21E-05
1.86E-08 0.007279 6.92E-06 3.55E-06 1.12E-08 9.57E-08 3.32E-10 0.897088 2.23E-06 0.038532 1.22E-07 0.000393 3.10E-05 6.31E-10 4.74E-12 0.001024 9.51E-06 3.97E-10 0.033915 0.000107
5.12E-09 0.000524 2.06E-07 2.73E-08 5.55E-07 7.19E-06 0.01902 5.19E-05 0.91091 4.14E-06 0.014912 2.92E-07 1.39E-05 1.03E-07 1.60E-05 1.79E-06 6.58E-07 4.31E-08 0.033084 0.036129
7.49E-06 0.000304 0.000236 6.72E-09 3.07E-11 1.36E-05 3.54E-08 0.044654 2.50E-07 0.718331 6.93E-11 3.16E-05 3.45E-06 6.60E-09 3.94E-09 0.120595 0.000239 2.02E-07 0.00023 7.37E-06
8.18E-09 1.79E-05 1.18E-06 3.61E-05 0.009656 1.86E-06 0.018687 3.10E-06 0.00428 3.40E-09 0.976487 4.94E-08 2.48E-07 2.59E-05 6.81E-05 3.52E-08 3.30E-06 3.77E-08 0.002767 0.000112
7.55E-05 0.009279 0.000832 0.021435 1.79E-05 2.47E-09 1.97E-09 0.000128 2.60E-08 3.53E-05 6.03E-10 0.917988 0.049269 0.001559 3.02E-05 1.83E-06 2.18E-08 3.56E-06 1.28E-05 1.57E-05
3.04E-12 0.000824 1.07E-11 0.015042 2.40E-07 8.44E-12 2.49E-10 9.46E-09 2.94E-06 3.77E-09 9.77E-11 0.123221 0.884166 6.83E-05 3.18E-09 9.61E-10 1.55E-11 7.29E-11 7.70E-07 0.053198
3.76E-05 4.77E-09 1.67E-06 0.0195 0.00127 5.91E-07 2.71E-05 9.68E-16 1.23E-12 1.46E-12 2.61E-08 1.69E-05 1.51E-05 0.95786 0.03697 2.81E-10 9.77E-08 0.00024 2.59E-14 1.17E-08
0.001563 9.76E-10 0.000274 1.21E-07 3.24E-05 9.20E-06 0.015759 2.15E-11 9.31E-09 4.39E-08 6.07E-07 1.31E-08 5.29E-09 0.037815 0.958112 1.93E-06 1.06E-05 0.007166 6.89E-09 5.26E-10
6.93E-09 6.23E-14 1.39E-08 1.12E-15 2.22E-14 5.02E-06 5.01E-08 1.11E-06 5.55E-13 0.1229927 5.06E-13 2.44E-16 2.61E-15 1.84E-11 6.16E-10 0.8202828 0.0502336 1.12E-08 6.47E-07 1.16E-13
2.19E-06 2.80E-14 2.20E-06 9.44E-14 1.79E-10 0.04255 0.00047 7.52E-09 6.34E-13 0.00018 6.97E-07 9.69E-18 6.36E-17 2.88E-09 4.02E-08 0.12112 0.90445 1.97E-06 1.67E-09 8.59E-14
0.15007 5.56E-12 4.11E-06 9.36E-06 0.0005 0.00562 4.58E-05 3.08E-10 5.25E-10 5.38E-06 5.03E-08 6.45E-09 6.57E-08 0.0138 0.05895 0.00054 0.00961 0.86343 1.59E-10 1.51E-10
5.08E-12 0.000196 9.53E-09 6.99E-07 8.85E-07 3.12E-10 1.24E-06 0.031623 0.004807 6.64E-05 9.81E-05 1.19E-06 1.11E-05 5.39E-08 6.83E-08 2.82E-05 2.29E-07 2.68E-11 0.978672 0.000524
3.23E-13 0.066767 7.67E-10 9.65E-06 9.09E-09 3.30E-07 2.96E-05 4.26E-06 0.01598 2.47E-07 0.000143 2.46E-05 0.005067 8.74E-08 4.19E-11 1.52E-07 9.44E-08 1.13E-12 0.000293 0.939515

Output matrix after applying the winner-takes-all principle (in each row, the largest entry is set to 1 and all other entries to 0):

1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1

This is exactly the 20×20 identity matrix, i.e. the target output: every input pattern is reconstructed correctly.
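The winner-takes-all step itself is a simple per-row argmax. A minimal NumPy sketch (illustrative; the 3×3 example matrix below is hypothetical, not taken from the document):

```python
import numpy as np

def winner_takes_all(Y):
    """Set the largest entry of each row to 1 and all others to 0."""
    out = np.zeros_like(Y)
    out[np.arange(Y.shape[0]), np.argmax(Y, axis=1)] = 1.0
    return out

# Hypothetical 3x3 reconstruction with diagonal-dominant rows:
Y = np.array([[0.81, 0.02, 0.11],
              [0.03, 0.88, 0.08],
              [0.04, 0.01, 0.91]])
print(winner_takes_all(Y))  # recovers the 3x3 identity matrix
```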
