Assignment Report: Experiment 2

This document describes an experiment using a neural network in MATLAB to approximate the XOR problem. It designs a neural network using a bipolar sigmoid activation function. The code initializes weights and biases, trains the network over multiple epochs using backpropagation, and calculates the root mean squared error (RMSE) as a performance index. A plot shows the RMSE decreasing with increasing number of iterations until it reaches below 0.005, at which point training stops.


JAYPEE UNIVERSITY OF ENGINEERING AND TECHNOLOGY

ECE DESIGN AND SIMULATION LAB II REPORT


EXPERIMENT NO:2
AIM: To design a neural network in MATLAB for approximation of the XOR problem using the
bipolar sigmoidal activation function, and to evaluate the performance index (RMSE).
SOFTWARE USED: MATLAB 7.0
CODE :
clear all; close all;
n=2;                 % number of input neurons
m=1;                 % number of output neurons
h=4;                 % number of hidden neurons
N=4;                 % number of training patterns
% initial input-to-hidden weights, and their previous values (for the momentum term)
v=[0.197 0.3191 -0.1448 0.3394; 0.3099 0.1904 -0.0347 -0.4861];
v1=zeros(n,h);
% initial hidden-to-output weights, and their previous values
w=[-0.3387; 0.2771; 0.2859; -0.3329];
w1=zeros(h,m);
b1=[-0.3378 0.2771 0.2859 -0.3329];   % hidden-layer biases
b2=-0.1401;                           % output bias
x=[1 1 -1 -1; 1 -1 1 -1];             % bipolar XOR inputs (one pattern per column)
t=[-1 1 1 -1];                        % bipolar XOR targets
alpha=0.3;                            % learning rate
mf=0.9;                               % momentum factor
con=1;
epoch=0;
while con
    e=0;                              % sum of squared errors over this epoch
    for I=1:N
        % forward pass: hidden layer
        for j=1:h
            zin(j)=b1(j);
            for i=1:n
                zin(j)=zin(j)+x(i,I)*v(i,j);
            end
            z(j)=bipsig(zin(j));
        end
        % forward pass: output layer
        yin=b2+z*w;
        y(I)=bipsig(yin);
        % backpropagation: output-layer error term and weight corrections
        delk=(t(I)-y(I))*bipsig1(yin);
        delw=alpha*delk*z'+mf*(w-w1);
        delb2=alpha*delk;
        % backpropagation: hidden-layer error terms
        delinj=delk*w;
        for j=1:h
            delj(j,1)=delinj(j,1)*bipsig1(zin(j));
        end
        for j=1:h
            for i=1:n
                delv(i,j)=alpha*delj(j,1)*x(i,I)+mf*(v(i,j)-v1(i,j));
            end
        end
        delb1=alpha*delj;
        % weight updates (save current weights for the next momentum term)
        w1=w;
        v1=v;
        w=w+delw;
        b2=b2+delb2;
        v=v+delv;
        b1=b1+delb1';
        e=e+(t(I)-y(I))^2;
    end
    Rmse=sqrt(e/N);                   % root mean squared error over the N patterns
    epoch=epoch+1;
    xl1(epoch)=epoch;
    yl1(epoch)=Rmse;
    if Rmse<0.005                     % stop when the RMSE falls below 0.005
        con=0;
    end
    if epoch==50000                   % safety cap on the number of epochs
        con=0;
    end
end
disp('Rmse')
disp(Rmse)
plot(xl1,yl1,'-+b')
xlabel('NO OF ITERATION'); ylabel('RMSE'); title('NO OF ITERATION VS RMSE')

FIG : NO OF ITERATION VS RMSE
(The plot shows the RMSE decreasing from about 0.7 to below the stopping threshold in roughly 140 iterations.)
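The script calls bipsig and bipsig1, which are not listed in the report; in MATLAB 7.0 they would be saved as separate function files on the path. A minimal sketch, assuming the standard bipolar sigmoid f(x) = (1 - e^(-x)) / (1 + e^(-x)) with derivative f'(x) = (1/2)(1 + f(x))(1 - f(x)) and unit steepness:

```matlab
% bipsig.m -- bipolar sigmoid activation, output range (-1, 1)
function y = bipsig(x)
y = (1 - exp(-x)) ./ (1 + exp(-x));

% bipsig1.m -- derivative of the bipolar sigmoid, written in terms of bipsig
function y = bipsig1(x)
y = 0.5 .* (1 + bipsig(x)) .* (1 - bipsig(x));
```

Note that this bipolar sigmoid is identical to tanh(x/2), so the same network could be written with MATLAB's built-in tanh.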

REPORT BY PRAKASH CHANDRA BHARTI

APR 2012
