Epuri Yoga Narasimha
updated on 20 Nov 2020
Aim: Curve fitting using regression.
Objective: Basic information about the parameters used in regression; curve fitting using regression by the least-squares method.
Theory: Regression means finding the correlation between two parameters. It is a very helpful technique for prediction and forecasting using the available data. Polynomials make the data very easy to analyse.
Training and testing the model:
Generally, a sample is taken from the given data and the best model is built from it; the remaining data is then used to check whether the model is good at predictions or not. Here the least-squares method is used.
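The hold-out idea above can be sketched in Python with NumPy (the data and the 80/20 split below are made up purely for illustration):

```python
import numpy as np

# Made-up noisy linear data, for illustration only
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 100)
y = 2.0 * x + 1.0 + rng.normal(0, 0.5, size=x.size)

# Hold out 20% of the points for testing; fit on the remaining 80%
idx = rng.permutation(x.size)
train, test = idx[:80], idx[80:]

coeffs = np.polyfit(x[train], y[train], 1)   # least-squares linear fit
y_pred = np.polyval(coeffs, x[test])         # predictions on held-out points

test_mse = np.mean((y[test] - y_pred) ** 2)  # low MSE -> model generalises well
```

If the error on the held-out points is close to the error on the training points, the model is not overfitting.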
Parameters used:
1. ŷ - called y hat; the value predicted by the regression line.
2. y - the actual (target) value given by the data.
3. Mean x̄ - the average of the given data. Suppose the given data x has size (1, 20):
x̄ = Σx / n,
where n is the number of data points in the sample.
4. SSE - called the explained sum of squares; the sum of squared differences between the predicted values and the mean value.
SSE = Σ(Y_predicted − Y_mean)².
5. SST - the total sum of squares; the sum of squared differences between the actual values and the mean value.
SST = Σ(Y_actual − Y_mean)².
6. SSR - the residual sum of squares; the sum of squared differences between the predicted and actual values.
SSR = Σ(Y_predicted − Y_actual)².
7. R² - called R-square; used to determine whether the curve is a good fit, how much the regressand depends on the chosen regressors, and whether more regressors should be taken.
R² = SSE / SST.
Limitation: if more regressors are considered, the R² value increases whether or not the new regressor is useful.
8. Adjusted R² - called adjusted R-square; overcomes the limitation of R-square, and is also called the upgraded R-square.
Adjusted R² = 1 − [(1 − R²)(n − 1) / (n − k − 1)],
where R² is the R-square value, n is the number of data points, and k is the number of parameters/regressors.
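A minimal NumPy sketch of these quantities, using the naming convention above (SSE as the explained sum of squares) on made-up data:

```python
import numpy as np

# Made-up nearly linear data and a linear least-squares fit, for illustration
x = np.array([1., 2., 3., 4., 5., 6.])
y = np.array([1.1, 2.0, 2.9, 4.2, 4.8, 6.1])
y_pred = np.polyval(np.polyfit(x, y, 1), x)

y_mean = y.mean()
SSE = np.sum((y_pred - y_mean) ** 2)   # explained sum of squares (as named above)
SST = np.sum((y - y_mean) ** 2)        # total sum of squares
SSR = np.sum((y_pred - y) ** 2)        # residual sum of squares

R2 = SSE / SST
n, k = len(x), 1                       # data points, regressors
adj_R2 = 1 - (1 - R2) * (n - 1) / (n - k - 1)
```

For a least-squares fit with an intercept, SSE + SSR = SST, so R² always lands between 0 and 1.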
Some regression metrics:
1. MAE - mean absolute error. If MAE = 0, the model fits the given data perfectly, but it may not be accurate when tested on other data.
MAE = Σ|Y_actual − Y_predicted| / n.
2. MSE - mean squared error. Be careful when using this metric, as it is sensitive to outliers.
MSE = Σ(Y_actual − Y_predicted)² / n.
3. RMSE - root mean squared error; the standard deviation of the residuals. It describes how widely the residuals are dispersed.
RMSE = √(Σ(Y_actual − Y_predicted)² / n).
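These three metrics can be computed directly from the errors; a small sketch with hypothetical actual and predicted values:

```python
import numpy as np

# Hypothetical actual vs. predicted values, for illustration
y_actual = np.array([3.0, 5.0, 7.0, 9.0])
y_pred   = np.array([2.5, 5.5, 6.5, 9.5])

err = y_actual - y_pred
MAE  = np.mean(np.abs(err))   # average absolute error -> 0.5
MSE  = np.mean(err ** 2)      # average squared error (outlier-sensitive) -> 0.25
RMSE = np.sqrt(MSE)           # same units as y -> 0.5
```

Note that MSE squares each error, so a single large outlier dominates it, while MAE treats all errors linearly.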
The accuracy of the fit increases with the degree of the polynomial, but only up to a certain degree, so that the fit does not become an 'overfit'.
A 'best fit' is the good fit for the curve; it does not show much variance when tested with other data.
Polynomials used for curve fitting:
1. Linear fitting: the polynomial generally used is Y = ax + b, where a is the slope of the line and b is the Y-intercept.
2. Quadratic fitting: Y = ax² + bx + c.
3. Cubic fitting: Y = ax³ + bx² + cx + d.
Similar equations follow on increasing the order.
General equation of a polynomial: ax^n + bx^(n−1) + cx^(n−2) + … + dx + e, where n is the order of the polynomial.
Inbuilt MATLAB functions used for fitting:
1. polyfit(f, g, n) - returns the coefficients of the best-fit polynomial of order n, computed by the least-squares method. f, g - sample data.
2. polyval(p, f) - gives the predicted values of the order-n polynomial for the respective values in array f.
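For comparison, NumPy offers analogous functions, np.polyfit and np.polyval; a minimal sketch on made-up exactly linear data:

```python
import numpy as np

# Made-up sample data following y = 2x + 1 exactly
f = np.array([0., 1., 2., 3., 4.])
g = 2.0 * f + 1.0

p = np.polyfit(f, g, 1)   # least-squares coefficients, highest degree first
q = np.polyval(p, f)      # predicted values at the sample points
```

Because the data is exactly linear, the recovered coefficients are [2, 1] and the predictions match g.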
Code snippet:
close all ; clear all ; clc ;
f = load('curvefitfile');
g = f(:,1);   % temperature
h = f(:,2);   % cp value
% polynomial fits from linear (degree 1) up to degree 7
p1 = polyfit(g, h, 1);  q1 = polyval(p1, g);   % linear regression
p2 = polyfit(g, h, 2);  q2 = polyval(p2, g);   % quadratic regression
p3 = polyfit(g, h, 3);  q3 = polyval(p3, g);   % cubic regression
p4 = polyfit(g, h, 4);  q4 = polyval(p4, g);   % 4th degree regression
p5 = polyfit(g, h, 5);  q5 = polyval(p5, g);   % 5th degree regression
p6 = polyfit(g, h, 6);  q6 = polyval(p6, g);   % 6th degree regression
p7 = polyfit(g, h, 7);  q7 = polyval(p7, g);   % 7th degree regression
q = {q1, q2, q3, q4, q5, q6, q7};
color_value = ['r','b','y','m','c','g','k'];
labels = {'linear fitting','quadratic fitting','cubic fitting', ...
          '4th order fitting','5th order fitting','6th order fitting','7th order fitting'};
for j = 1:7
    i = 1:100:length(g);                 % plot every 100th sample point
    plot(g(i), h(i), '*')
    hold on
    plot(g, q{j}, 'color', color_value(j))
    legend('data', labels{j})            % label both the data and the fit
    xlabel('temperature')
    ylabel('cp value')
    title('temperature vs cp')
    pause(3)                             % pause so each fit can be seen
    if j ~= 7
        clf                              % clear the figure before the next fit
    end
end
The code snippet above shows that the accuracy of the fit increases as the order of the polynomial increases; refer to the figures below.
Explanation:
1. General commands are used to clear the current data.
2. The data in 'curvefitfile' is loaded into the variable f.
3. The polyfit and polyval commands (explained above) give the required values for the different degrees.
4. A for loop and the plot command plot the graph; observe the flow of the code.
5. pause(x) - pauses execution for x seconds.
6. clf - clears the current figure.
7. xlabel, ylabel and title are for labelling.
Code snippet for R-square, SSE and SST (shown here for the 7th-order fit q7; repeat with q1 to q6 for the other fits):
y_mean1 = mean(f(:,2));
SSE = sum((q7 - y_mean1).^2);
SST = sum((f(:,2) - y_mean1).^2);
Rsquare = SSE/SST;
Step explanation of the code:
This snippet calculates the R-square value, one of the parameters that describes the accuracy of the fit.
1. The quantities are calculated using the respective formulas explained in the theory section.
2. R-square values for the different polynomial degrees are shown below.
It is observed that the R-square value increases on increasing the order of the polynomial, but that alone does not make the model good. Model selection criteria give the optimal range of good fitting; they depend on parameters such as the likelihood.
Model selection criteria:
1. AIC (Akaike information criterion).
2. BIC (Bayesian information criterion).
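AIC and BIC can be computed from the residual sum of squares under the common Gaussian least-squares form; the data, helper function name and degrees below are illustrative assumptions, not part of the original write-up:

```python
import numpy as np

def aic_bic(y, y_pred, k):
    """AIC/BIC under the common Gaussian least-squares form.
    k counts the fitted parameters (polynomial degree + 1 here)."""
    n = len(y)
    rss = np.sum((y - y_pred) ** 2)          # residual sum of squares
    aic = n * np.log(rss / n) + 2 * k        # penalises parameters by 2k
    bic = n * np.log(rss / n) + k * np.log(n)  # stronger penalty for large n
    return aic, bic

# Compare a linear and a 5th-order fit on noisy data that is truly linear
rng = np.random.default_rng(1)
x = np.linspace(0, 10, 50)
y = 3.0 * x + rng.normal(0, 1.0, x.size)

scores = {deg: aic_bic(y, np.polyval(np.polyfit(x, y, deg), x), deg + 1)
          for deg in (1, 5)}
# The penalty terms favour the simpler model when the extra terms add little
```

Although the 5th-order fit always has a slightly smaller residual, its penalty term is larger, so the criteria point back to the simpler linear model.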
Splitwise fitting:
close all ; clear all ; clc ;
data = load('curvefitfile');
% splitting the data array into 8 parts
j = [400 , 800 , 1200 , 1600 , 2000 , 2400 , 2800 , 3199];
i = 1;
for m = 1:length(j)
    d{m} = data(i:j(m), 1:2);   % store each chunk in a cell array
    i = j(m) + 1;
end
% deploying the data from the cell array
a1 = d{1}; a2 = d{2}; a3 = d{3}; a4 = d{4};
a5 = d{5}; a6 = d{6}; a7 = d{7}; a8 = d{8};
% linear fit on each chunk
p1 = polyfit(a1(:,1),a1(:,2),1);  q1 = polyval(p1,a1(:,1));
p2 = polyfit(a2(:,1),a2(:,2),1);  q2 = polyval(p2,a2(:,1));
p3 = polyfit(a3(:,1),a3(:,2),1);  q3 = polyval(p3,a3(:,1));
p4 = polyfit(a4(:,1),a4(:,2),1);  q4 = polyval(p4,a4(:,1));
p5 = polyfit(a5(:,1),a5(:,2),1);  q5 = polyval(p5,a5(:,1));
p6 = polyfit(a6(:,1),a6(:,2),1);  q6 = polyval(p6,a6(:,1));
p7 = polyfit(a7(:,1),a7(:,2),1);  q7 = polyval(p7,a7(:,1));
p8 = polyfit(a8(:,1),a8(:,2),1);  q8 = polyval(p8,a8(:,1));
plot(data(:,1),data(:,2),'color','r')
hold on
plot(a1(:,1),q1,'color','b')
plot(a2(:,1),q2,'color','m')
plot(a3(:,1),q3,'color','c')
plot(a4(:,1),q4,'color','y')
plot(a5(:,1),q5,'color','g')
plot(a6(:,1),q6,'color','b')
plot(a7(:,1),q7,'color','m')
plot(a8(:,1),q8,'color','y')
xlabel('Temperature [K]')
ylabel('Specific heat at constant pressure (\bf{C}_{\bfp})')
title('splitwise fitting')
Step explanation of the code:
1. General commands are used to clear the current data.
2. The data in 'curvefitfile' is loaded into the variable data.
3. The data is split and stored in a cell array using a for loop.
4. Each chunk is accessed, polyfit and polyval give the required values, and the results are plotted with the plot command.
5. xlabel, ylabel and title are for labelling.
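The same splitwise idea can be sketched in Python with NumPy on a synthetic curve (the curve shape, chunk count and values below are made up for illustration):

```python
import numpy as np

# Synthetic smooth curve, split into 4 chunks, each fitted with its own line
x = np.linspace(300, 3000, 400)
y = 1000 + 0.5 * x - 5e-5 * x ** 2           # made-up cp-like quadratic curve

pieces = np.array_split(np.column_stack((x, y)), 4)   # split rows into 4 parts
fits = [np.polyfit(seg[:, 0], seg[:, 1], 1) for seg in pieces]
preds = [np.polyval(p, seg[:, 0]) for p, seg in zip(fits, pieces)]

# Each local line tracks its own segment much better than one global line does
max_err = max(np.max(np.abs(seg[:, 1] - q)) for seg, q in zip(pieces, preds))
global_err = np.max(np.abs(y - np.polyval(np.polyfit(x, y, 1), x)))
```

Because each linear piece only has to follow a quarter of the curvature, the piecewise error is far smaller than the single global linear fit's error.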
Result using splitwise fitting:
Using 'cftool' in MATLAB:
Observing the goodness-of-fit values, the fit improves on increasing the order of the polynomial. Use the 'Center and scale' option if you get the warning that the polynomial is badly conditioned.
Best fit: the best fit is the good fit; the model makes good predictions even with different data.
It is consistent.
Perfect fit: also called overfitting; this is not a good fit, as it gives good predictions only for the given data, and the model is too sensitive to other data.
It is inconsistent.
How to get the best fit:
The regression metrics tell whether a fit is best or not; the polynomial order helps only up to an optimal range, and that range can be found from the model selection criteria.
How to make the curve fit closely:
1. Simply increase the order of the polynomial.
2. Use the splitwise method.
Both methods are explained above.
Using the Basic Fitting option:
In the output figure window, go to Tools > Basic Fitting.
Conclusion:
1. Curve fitting was done using MATLAB commands.
2. The curves were analysed by goodness of fit, and information was gathered about the parameters in regression analysis.