Curve Fitting and Criterion to Choose Best Fit
Khaled Ali
updated on 01 Dec 2019
Introduction:
Curve fitting is the process of introducing mathematical relationships between dependent and independent variables in the form of an equation for a given set of data.
Fig. Curve Fitting Example
The goal is to identify the coefficients ‘a’ and ‘b’ such that f(x) ‘fits’ the data well.
Perfect fit & Best fit:
“Best fit” simply means that the differences between the actual measured Y values and the Y values predicted by the model equation are minimized. It does not mean a “perfect” fit; in most cases, a least-squares best fit does not pass through all the points in the data set. Above all, a least-squares fit must conform to the selected model - for example, a straight line or a quadratic parabola - and there will almost always be some data points that do not fall exactly on the best-fit curve. A “perfect fit”, on the other hand, implies that the model fits the data completely, without any error (R^2 = 1.0).
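As a quick illustration with made-up numbers, a least-squares line through five noisy points is a best fit but leaves nonzero residuals, while a fourth-degree polynomial (one coefficient per point) reproduces those same points exactly:
% best fit vs. perfect fit on five hypothetical data points
x = (1:5)';
y = [1.1; 1.9; 3.2; 3.8; 5.1];      % roughly linear, noisy data (made up)
p1 = polyfit(x, y, 1);              % least-squares straight line (best fit)
y - polyval(p1, x)                  % residuals: small but nonzero
p4 = polyfit(x, y, 4);              % degree 4 through 5 points (perfect fit)
max(abs(y - polyval(p4, x)))        % essentially zero (R^2 = 1)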
How to Evaluate Goodness of Fit?
Goodness of fit is evaluated through the following statistical parameters:
Sum of Squares Due to Error:
This statistic measures the total deviation of the response values from the fit to the response values. It is also called the summed square of residuals and is usually labeled as SSE.
SSE = \sum_{i=1}^{n} (Y_i - f(x_i))^2
A value closer to 0 indicates that the model has a smaller random error component, and that the fit will be more useful for prediction.
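In MATLAB this is a one-liner; a minimal sketch, assuming y holds the measured values and y_fit the model's predictions at the same points:
% y and y_fit are assumed vectors of measured and predicted values
residuals = y - y_fit;       % pointwise deviations from the fit
SSE = sum(residuals.^2);     % summed square of residuals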
R-Square:
It measures the proportion of the variation in the dependent variable that is explained by all of the independent variables in the model. It assumes that every independent variable in the model helps to explain variation in the dependent variable. In reality, some independent variables (predictors) do not help to explain the dependent (target) variable; in other words, some variables do not contribute to predicting the target variable.
Mathematically, R-squared can be calculated by dividing the sum of squares of residuals (SSE) by the total sum of squares (SST) and subtracting the result from 1. Here, SST measures total variation, SSR measures explained variation, and SSE measures unexplained variation.
Equivalently, R-square is defined as the ratio of the sum of squares of the regression (SSR) to the total sum of squares (SST). SSR is defined as
SSR = \sum_{i=1}^{n} (f(x_i) - \bar{Y})^2
SST is also called the sum of squares about the mean, and is defined as
SST = \sum_{i=1}^{n} (Y_i - \bar{Y})^2
SST = SSR + SSE. Given these definitions, R-square is expressed as
R^2 = \frac{SSR}{SST}
R-squared lies between 0% and 100%. An R-squared value of 100% means the model explains all the variation in the target variable, while a value of 0% means the model has zero predictive power. The higher the R-squared value, the better the model.
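Continuing the same sketch (y measured, y_fit predicted), R-square follows directly from these sums of squares:
SSR = sum((y_fit - mean(y)).^2);    % explained variation
SST = sum((y - mean(y)).^2);        % total variation about the mean
R2 = SSR/SST;                       % equivalently, 1 - SSE/SST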
Adjusted R-square:
It measures the proportion of variation explained by only those independent variables that really help in explaining the dependent variable, and it penalizes you for adding independent variables that do not help in predicting the dependent variable.
Adjusted R-squared can be calculated mathematically in terms of the sums of squares. The only difference between the R-square and Adjusted R-square equations is the degrees of freedom.
R^2_{adj} = 1 - \frac{(1 - R^2)(N - 1)}{N - P - 1}
where R^2 is the sample R-square, P is the number of predictors, and N is the total sample size.
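A minimal sketch, continuing with R2 from above; N and P here are chosen for illustration (a cubic fit, for instance, has P = 3 predictors):
N = length(y);       % total sample size (assumed vector y)
P = 3;               % number of predictors, e.g. a cubic fit (assumption)
adjusted_R2 = 1 - (1 - R2)*(N - 1)/(N - P - 1);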
Difference between R-square and Adjusted R-square
Every time you add an independent variable to a model, R-squared increases, even if the variable is insignificant; it never declines. Adjusted R-squared, in contrast, increases only when the added independent variable is significant and actually affects the dependent variable. Adjusted R-squared can be negative when R-squared is close to zero, and its value is always less than or equal to R-squared. Adjusted R-square should therefore be used to compare models with different numbers of independent variables, and when selecting important predictors (independent variables) for the regression model.
Root Mean Squared Error
This statistic is also known as the fit standard error and the standard error of the regression. It is an estimate of the standard deviation of the random component in the data, and is defined as
RMSE = \sqrt{\frac{SSE}{n}}
where n is the number of data points available.
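In the same sketch, with SSE computed as above:
n = length(y);          % number of data points
RMSE = sqrt(SSE/n);     % fit standard error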
Improving the Goodness of fit
Any model that results in an R-squared greater than 0.95 is considered a good fit here. To reach or improve on this value, either increase the order of the polynomial, or use the split-wise method: split the data into portions, fit each portion separately, and then join the fits.
How to make a curve fit perfectly?
A perfect fit means the fitted curve reproduces the original curve without any error at a particular polynomial degree. Such a fit can be found by trying different polynomial degrees; in this case the ninth-degree polynomial turned out to be perfect.
How to get the best fit?
A best fit can be obtained by increasing the order of the polynomial or by the split-wise method, in order to minimize the deviation. A perfect fit is always a best fit, but a best fit need not be a perfect fit. So parameters such as R^2 are used to check the goodness of fit and to compare different fits when determining the best one (here, a good fit is anything with an R^2 value greater than 0.95).
What could be done to improve the cubic fit?
A particular cubic fit can be improved by splitting the curve into small parts, fitting each part separately, and then joining the pieces.
Curve fitting polynomial equations
f(x) = p_1 x + p_2
f(x) = p_1 x^2 + p_2 x + p_3
f(x) = p_1 x^3 + p_2 x^2 + p_3 x + p_4
f(x) = p_1 x^9 + p_2 x^8 + p_3 x^7 + p_4 x^6 + p_5 x^5 + p_6 x^4 + p_7 x^3 + p_8 x^2 + p_9 x + p_{10}
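Note that MATLAB's polyfit returns its coefficient vector in this same descending-power order, so the equations above map one-to-one onto its output; a small sketch, assuming x and y are the data vectors:
p = polyfit(x, y, 3);      % p = [p1 p2 p3 p4] for the cubic form above
f = polyval(p, x);         % evaluates p1*x.^3 + p2*x.^2 + p3*x + p4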
To reach a perfect fit (R^2 = 1), we first increased the degree of the polynomial from one to nine; the R^2 value improved from 0.9249 for the linear fit, to 0.9812 for the quadratic fit, to 0.9967 for the cubic fit, and finally to 1.0 for the ninth-degree polynomial fit.
Increasing order of polynomial code
clear all
close all
clc

% load data: column 1 = temperature [K], column 2 = specific heat cp [kJ/kmol-K]
cp_data = load('data');
temperature = cp_data(:,1);
cp = cp_data(:,2);
n = length(cp);   % number of data points (3200 in this data set)

% curve fits of increasing polynomial order:
% cp = a*t + b            (linear)
% cp = a*t^2 + b*t + c    (quadratic)
co_eff_1 = polyfit(temperature, cp, 1);          % linear fit
predicted_cp1 = polyval(co_eff_1, temperature);
co_eff_2 = polyfit(temperature, cp, 2);          % 2nd-degree polynomial
predicted_cp2 = polyval(co_eff_2, temperature);
co_eff_3 = polyfit(temperature, cp, 3);          % 3rd-degree polynomial
predicted_cp3 = polyval(co_eff_3, temperature);
co_eff_9 = polyfit(temperature, cp, 9);          % 9th-degree polynomial
predicted_cp9 = polyval(co_eff_9, temperature);

% compare the fits against the original data
plot(temperature, cp, 'linewidth', 3, 'color', 'g')
hold on
plot(temperature, predicted_cp1, 'linewidth', 3, 'color', 'y')
plot(temperature, predicted_cp2, 'linewidth', 3, 'color', 'b')
plot(temperature, predicted_cp3, 'linewidth', 3, 'color', 'r')
plot(temperature, predicted_cp9, 'linewidth', 3, 'color', 'c')
grid on
xlabel('temperature [K]')
ylabel('specific heat [kJ/kmol-K]')
title('linear vs. quadratic vs. cubic vs. ninth degree polynomial fit')
legend('original data', 'linear', 'quadratic', 'cubic', 'ninth degree')

% checking goodness of fit (result lines left unsuppressed so they print)
% linear fit
error1 = cp - predicted_cp1;
sse_linear = sum(error1.^2);
ssr_linear = sum((predicted_cp1 - mean(cp)).^2);
sst_linear = sse_linear + ssr_linear;
rsquare_linear = ssr_linear/sst_linear
rmse_linear = sqrt(sse_linear/n)
% quadratic fit
error2 = cp - predicted_cp2;
sse_quad = sum(error2.^2);
ssr_quad = sum((predicted_cp2 - mean(cp)).^2);
sst_quad = sse_quad + ssr_quad;
rsquare_quad = ssr_quad/sst_quad
rmse_quad = sqrt(sse_quad/n)
% cubic fit
error3 = cp - predicted_cp3;
sse_cubic = sum(error3.^2);
ssr_cubic = sum((predicted_cp3 - mean(cp)).^2);
sst_cubic = sse_cubic + ssr_cubic;
rsquare_cubic = ssr_cubic/sst_cubic
rmse_cubic = sqrt(sse_cubic/n)
% ninth-degree fit
error9 = cp - predicted_cp9;
sse_ninth_degree = sum(error9.^2);
ssr_ninth_degree = sum((predicted_cp9 - mean(cp)).^2);
sst_ninth_degree = sse_ninth_degree + ssr_ninth_degree;
rsquare_ninth_degree = ssr_ninth_degree/sst_ninth_degree
rmse_ninth_degree = sqrt(sse_ninth_degree/n)
Code explanation, output, and results: [plots and console output omitted; the resulting R^2 values are summarized above]
Split-wise curve fit
After reaching a perfect fit by increasing the order of the polynomial, we then tried the other approach: splitting the plot into eight equal portions, fitting every portion, and combining them together. The cubic fit was chosen as the basis for this piecewise fit because it already had a very good R^2 value of 0.99. This resulted in the same perfect fit achieved with the ninth-degree polynomial (R^2 = 1.0).
Split-wise curve fit code
clear all
close all
clc

% load data: column 1 = temperature [K], column 2 = specific heat cp [kJ/kmol-K]
cp_data = load('data');
temperature = cp_data(:,1);
cp = cp_data(:,2);
n = length(cp);   % 3200 data points

% split the data into eight equal portions, fit a cubic to each
% portion, then join the piecewise predictions together
points_per_split = n/8;   % 400 points per portion
predicted_cp = zeros(n,1);
for k = 1:8
    idx = (k-1)*points_per_split + 1 : k*points_per_split;
    co_eff = polyfit(temperature(idx), cp(idx), 3);        % cubic fit on this portion
    predicted_cp(idx) = polyval(co_eff, temperature(idx));
end

% plotting original data
plot(temperature, cp, 'linewidth', 3, 'color', 'g')
hold on
% plotting the joined split-wise fit
plot(temperature, predicted_cp, 'linewidth', 3, 'color', 'y')
xlabel('temperature [K]')
ylabel('specific heat [kJ/kmol-K]')
title('split-wise cubic curve fit')
grid on

% checking goodness of fit (result lines left unsuppressed so they print)
error = cp - predicted_cp;
sse_cubic = sum(error.^2);
ssr_cubic = sum((predicted_cp - mean(cp)).^2);
sst_cubic = ssr_cubic + sse_cubic;
rsquare_cubic_splitwise = ssr_cubic/sst_cubic
rmse_cubic = sqrt(sse_cubic/n)
Code explanation, output, and results: [plots and console output omitted; the split-wise cubic fit reproduces the data with R^2 = 1.0, as noted above]
Errors in code:
Most of the errors were punctuation and naming mistakes, made mainly while assigning different names to the many parts of this long script; all of them were corrected.
References:
https://terpconnect.umd.edu/~toh/models/CurveFitting.html
https://projects.skill-lync.com/projects/Curve-Fitting-and-criterion-to-choose-best-fit-41860