Answer PDF

The document contains code to perform linear regression on two datasets. For the second dataset given:
- The data is loaded with x as column 1 and y as column 2 of the input matrix.
- Linear regression is performed to find coefficients a and b that minimize the least-squares error.
- The best-fit line and the original data points are plotted on a graph.
- The linear coefficients are printed, showing an intercept of 52.448270 and a slope of -0.019515.

Problem no. 7 code:

xstep = 0.1;
tstep = 0.05;
xstep2 = xstep*xstep;
tstep2 = tstep*tstep;
alpha = 2;
alpha2 = alpha*alpha;
lambda2 = alpha2*tstep2/xstep2;
xdomain = [0 1];
tdomain = [0 1];
nx = round((xdomain(2)-xdomain(1))/xstep);
nt = round((tdomain(2)-tdomain(1))/tstep);
xt0 = zeros((nx+1),1); % initial condition
dxdt0 = zeros((nx+1),1); % initial derivative
xold = zeros((nx+1),1); % solution at timestep k
x2old = zeros((nx+1),1); % solution at timestep k-1
xnew = zeros((nx+1),1); % solution at timestep k+1
% initial condition
% nx+1 grid points in x (here 11)
pi = acos(-1.0); % numerically the same as MATLAB's built-in pi
for i=1:nx+1
xi = (i-1)*xstep;
if(xi>=0 && xi<=1)
xt0(i) = sin(2*pi*xi);
dxdt0(i) = alpha*pi*sin(2*pi*xi);
xold(i) = xt0(i)+dxdt0(i)*tstep;
xold(i) = xold(i) - 4*pi*pi*sin(2*pi*xi)*tstep2*alpha2;
end
end
x2old = xt0; % solution at t = 0, needed by the first leapfrog step below
close all
t = 0.3;
x = linspace(xdomain(1),xdomain(2),nx+1);
analy = sin(2*pi*x)*(sin(4*pi*t)+cos(4*pi*t)); % analytical solution at t = 0.3
h1=plot(x,analy,'linewidth',2);
hold on;
h2=plot(x,xold,'linewidth',2); % xold is a column vector; plot it directly
hold on;
h3=plot(x,xnew,'linewidth',2); % updated during the time-stepping loop below
hold off
legend('Analytical','Initial','Final')
xlabel('x [m]');
ylabel('Displacement [m]');
set(gca,'FontSize',16);
for k=2:nt
time = k*tstep; % current simulation time
for i=1:nx+1
% Use periodic boundary condition, u(nx+1)=u(1)
if(i==1)
xnew(i) = 2*(1-lambda2)*xold(i) + lambda2*(xold(i+1)+xold(nx+1)) - x2old(i);
elseif(i==nx+1)
xnew(i) = 2*(1-lambda2)*xold(i) + lambda2*(xold(1)+xold(i-1)) - x2old(i);
else
xnew(i) = 2*(1-lambda2)*xold(i) + lambda2*(xold(i+1)+xold(i-1)) - x2old(i);
end
end
x2old=xold;
xold = xnew;
if(mod(k,2)==0)
h3.YData = xnew;
refreshdata(h3);
pause(0.5);
end
end
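A side note, not part of the original answer: this explicit central-difference scheme for the wave equation is only stable when the Courant number lambda = alpha*tstep/xstep is at most 1, and with the step sizes above it sits exactly at that limit. A minimal check, reusing the variable names already defined in the code:

% Courant (CFL) number for the explicit wave-equation scheme.
% The update above is stable only if lambda = alpha*tstep/xstep <= 1.
lambda = alpha*tstep/xstep; % with the values above: 2*0.05/0.1 = 1
fprintf('Courant number lambda = %f\n', lambda);
if lambda > 1
    warning('Unstable time step: reduce tstep or coarsen xstep.');
end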
Problem no. 8 code:

input = [...
1955, 12.7;...
1225, 30.1;...
1008, 33.9;...
1323, 21.3;...
710, 41.6;...
1350, 25.5;...
1436, 21.1;...
1561, 22.6;...
2120, 13.3;...
2110, 13.6];
m = size(input, 1);
n = size(input, 2);
x = input(:,1:n-1);
y = input(:,n);
% The first column of matrix X is populated with ones,
% and the rest columns are the x columns of the input.
X = ones(m, n);
X(:,2:n) = input(:,1:n-1);
% Find the coefficient vector a that minimizes the least-squares error ||X*a - y||^2.
% Projecting y onto the column space C(X) gives the fitted values b = X*a.
% The normal equations are X'*X*a = X'*y.
% Use left division \ to solve them; this is equivalent to
% a = inv(X'*X)*X'*y but computationally cheaper and more stable.
a = (X' * X) \ (X' * y)
b = X*a
least_square_error = sum((b - y) .^ 2)
% Plot the best fit line.
plot(x, b);
title(sprintf('y = %f + %fx', a(1), a(2)));
xlabel('x');
ylabel('y');
hold on;
% Plot the input data.
plot(x, y, '+r');
hold off;
pause;
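An aside that is not part of the original answer: MATLAB's backslash can also be applied directly to the overdetermined system, which solves the same least-squares problem via a QR factorization without ever forming X'*X. With the same X and y as above, the result should agree with a up to rounding:

% Alternative least-squares solve: backslash on the tall matrix X itself.
% For an overdetermined system, X\y returns the least-squares solution (via QR),
% avoiding the explicit product X'*X, which can be poorly conditioned.
a_qr = X \ y;
disp(max(abs(a_qr - a))); % should be on the order of rounding error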

Since the data for the first part is not given, I can't solve that part; you can plug in that data and run the same code. Here I have solved it for the given data, using the w data as x and the c data as y; the fitted linear coefficients are intercept a = 52.448270 and slope b = -0.019515.
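As a quick cross-check (my addition, not part of the original answer), the same coefficients should be reproducible with polyfit; note that polyfit returns the slope first and the intercept second:

% Degree-1 polynomial fit as a sanity check on the reported coefficients.
% p(1) is the slope and p(2) is the intercept.
p = polyfit(x, y, 1);
fprintf('intercept = %f, slope = %f\n', p(2), p(1));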
