Sourabh Lakhera
updated on 24 May 2020
AIM
Write a program to fit a linear and a cubic polynomial to the Cp data and measure the fitness characteristics of both curves.
OBJECTIVES
· Write a MATLAB code to fit a linear and a cubic polynomial to the given data.
· Plot both fits against the original data to compare them and identify the best-fit curve.
· Measure the accuracy and precision of each fitted curve with respect to the original data curve.
· Finally, measure all the error metrics.
Theory :-
What is Curve Fitting?
Curve fitting is the process of constructing a curve, or mathematical function, that has the closest proximity to a given series of data points. By curve fitting we can mathematically construct the functional relationship between the observed dataset and the parameter values. It is highly effective in the mathematical modelling of many natural processes.
Types of curve fitting include:
• Interpolation, where you find a function that exactly fits the data points. Since this assumes no measurement error, it has limited applicability to real-world scenarios.
• Smoothing, where we find a function that approximately fits the data points; we allow room for error, so the actual points may lie near, but not necessarily on, the curve, provided the overall error is minimized.
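As a minimal sketch of the difference (the x/y values and the spline choice here are hypothetical, for illustration only): interpolation forces the curve through every point, while a least-squares fit only approximates them.

% Hypothetical noisy data, for illustration only
x = 0:1:10;
y = 2*x + 3 + randn(size(x));            % underlying line plus noise
xq = 0:0.1:10;                            % query points for plotting
y_interp = interp1(x, y, xq, 'spline');   % interpolation: passes through every data point
p = polyfit(x, y, 1);                     % smoothing: degree-1 least-squares fit
y_fit = polyval(p, xq);
plot(x, y, 'ko', xq, y_interp, 'b--', xq, y_fit, 'r-');
legend('data', 'interpolation', 'least-squares line');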
Linear and Polynomial Curve fitting :
Linear curve fitting, or linear regression, is when the data is fitted with a straight line. Although there may be some curvature in your data, a straight line often provides a reasonable enough fit to make predictions.
Since the equation of a generic straight line is f(x) = ax + b, the question becomes: what values of a and b give us the best-fit line for our data?
Considering the vertical distance from each point to a prospective line as an error, and summing these distances over our range, gives a concrete number that expresses how far from 'best' the potential line is. The line that gives the minimum total error can be considered the best-fit line.
Since it is the distance from our points to the line we are interested in, whether that distance is positive or negative is not relevant, so we square the distances in the error calculation. This also weights larger errors more heavily. This method is therefore known as the least-squares approach.
Polynomial curve fitting is when we fit our data to the graph of a polynomial function. The same least-squares method can be used to find the polynomial, of a given degree, that has the minimum total error.
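For the straight-line case, the least-squares a and b have a closed form; a small sketch (with hypothetical data values) comparing the hand-computed slope and intercept with MATLAB's polyfit:

% Hypothetical data, for illustration only
x = [1 2 3 4 5]';
y = [2.1 4.2 5.9 8.1 9.8]';
n = length(x);
a = (n*sum(x.*y) - sum(x)*sum(y)) / (n*sum(x.^2) - sum(x)^2);  % least-squares slope
b = mean(y) - a*mean(x);                                        % least-squares intercept
p = polyfit(x, y, 1);                                           % p(1) ~ a, p(2) ~ b
fprintf('hand-computed: a = %.4f, b = %.4f; polyfit: %.4f, %.4f\n', a, b, p(1), p(2));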
To choose the best fit, the following four parameters help us measure the goodness of fit, i.e. how well the equation represents the given data points:
1. The sum of squares due to error (SSE),
2. R-square (R^2),
3. Adjusted R-square,
4. Root mean squared error (RMSE).
EQUATIONS
Curve Fitting
Polynomial curve fitting is when we fit our data to the graph of a polynomial function.
p(x):
p(x) = p_1 x^n + p_2 x^{n-1} + \dots + p_n x + p_{n+1}
Measure the fitness characteristics of the curves.
\mathrm{Error}(i) = |y(i) - f(x(i))|
Here,
y(i) = data at point i
f(x(i)) = value of the fitted polynomial function at point i
The sum of squares due to error (SSE)
SSE = \sum_{i=1}^{n} \mathrm{Error}(i)^2
R-square
SSR = \sum_{i=1}^{n} (f(x(i)) - \mathrm{Mean})^2
SSE = \sum_{i=1}^{n} (y(i) - f(x(i)))^2
SST = SSR + SSE
R^2 = \frac{SSR}{SST}
Adjusted R-square
R^2_{adj} = 1 - \frac{(1 - R^2)(n - 1)}{n - k - 1}
where,
n = number of data points
k = number of independent variables
Root mean squared error (RMSE)
RMSE = \sqrt{\frac{1}{n}\sum_{i=1}^{n} (y(i) - f(x(i)))^2} = \sqrt{\frac{SSE}{n}}
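These metrics can be checked on a tiny worked example before running the full script; the vectors y and f and their values below are hypothetical, chosen only to make the formulas concrete.

% Hypothetical observed data and fitted values, for illustration only
y = [1.0 2.1 2.9 4.2]';                       % observed data
f = [1.1 2.0 3.0 4.1]';                       % values of the fitted polynomial
n = length(y);
k = 1;                                        % one independent variable assumed
SSE  = sum((y - f).^2);                       % sum of squares due to error
SSR  = sum((f - mean(y)).^2);                 % sum of squares of the regression
SST  = SSR + SSE;
R2   = SSR/SST;                               % R-square
R2a  = 1 - (1 - R2)*(n - 1)/(n - k - 1);      % adjusted R-square
RMSE = sqrt(SSE/n);                           % root mean squared error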
MATLAB CODE :-
clear all
close all
clc
% load the Cp data from the file 'data.txt'
cp_data = load('data.txt');
temperature = cp_data(:,1);
cp = cp_data(:,2);
legendInfo{1} = 'Original dataset';
% VARIATION OF POLYNOMIAL DEGREE OF EQUATION
max_poly_dgree = 5;
% loop over the different polynomial degrees
for poly_dgree = 1:max_poly_dgree
% use polyfit to compute the fit coefficients for this polynomial degree
co_effs = polyfit(temperature,cp,poly_dgree);
% use polyval to evaluate the fitted polynomial (predicted Cp) for plotting
predicted_cp(:,poly_dgree) = polyval(co_effs, temperature);
% store the current predicted column in q for readability
q = predicted_cp(:,poly_dgree);
% compute the different error metrics using the sub-function error_in_fitting
[SSE,SSR,SST,R_squr,R_squrAdj,RMS] = error_in_fitting(cp,q,1,length(cp));
% store the error metrics for this degree as one column of fitError
fitError(:,poly_dgree) = [SSE ; R_squr; R_squrAdj ; RMS];
legendInfo{poly_dgree+1} = ['FitDegree ' num2str(poly_dgree)]; % legendInfo Strings
end
%[SSE,SSR,SST,R_squr,R_squrAdj,RMS] = error_in_fitting(cp , z , w ,n)
% plot the fitted curves for the different degrees against the actual curve
figure(1)
set(gcf,'position',[10,10,900,600])
plot(temperature, cp, 'linewidth',3,'color','r');
hold on
plot(temperature, predicted_cp, 'linewidth',1);
xlabel('Temperature [K]');
ylabel('Specific Heat [kJ/kmol-K]');
title('Curve fitting of Cp data');
legend(legendInfo,'location','northwest');
% plot each error metric against the polynomial degree
for poly_dgree = 1:max_poly_dgree
q1 = fitError(1,poly_dgree) ;
q2 = fitError(2,poly_dgree) ;
q3 = fitError(3,poly_dgree) ;
q4 = fitError(4,poly_dgree) ;
% Method 1: error calculation by sum of squares (SSE)
figure(2)
set(gcf,'position',[10,10,900,600])
hold on
plot(poly_dgree, q1,'b*');
text(poly_dgree, q1 ,legendInfo{1,poly_dgree+1},'VerticalAlignment','bottom','HorizontalAlignment','center');
xlabel('Degree of Polynomial');
ylabel('Sum of squares Error (SSE)');
title('Sum of squares Error (SSE)');
% Method 2: error calculation by R-square
figure(3)
set(gcf,'position',[10,10,900,600])
hold on
plot(poly_dgree, q2,'c*');
text(poly_dgree, q2,legendInfo{1,poly_dgree+1},'VerticalAlignment','top','HorizontalAlignment','center');
xlabel('Degree of Polynomial');
ylabel('R-square Error ');
title('R-square Error');
% Method 3: error calculation by Adjusted R-square
figure(4)
set(gcf,'position',[10,10,900,600])
hold on
plot(poly_dgree, q3,'k*');
text(poly_dgree, q3,legendInfo{1,poly_dgree+1},'VerticalAlignment','bottom','HorizontalAlignment','center');
xlabel('Degree of Polynomial');
ylabel('Adjusted R-square Error ');
title('Adjusted R-square Error');
% Method 4: error calculation by Root mean squared error (RMSE)
figure(5)
set(gcf,'position',[10,10,900,600])
hold on
plot(poly_dgree, q4,'m*');
text(poly_dgree, q4,legendInfo{1,poly_dgree+1},'VerticalAlignment','bottom','HorizontalAlignment','center');
xlabel('Degree of Polynomial');
ylabel('Root mean squared Error (RMSE)');
title('Root mean squared Error (RMSE)');
end
% Display the plots in order and save each as a .jpg
figure(5)
saveas(gcf,strcat('Root mean squared error (RMSE)','.jpg'))
figure(4)
saveas(gcf,strcat('Adjusted R-square','.jpg'))
figure(3)
saveas(gcf,strcat('R-square','.jpg'))
figure(2)
saveas(gcf,strcat('Sum of squares due to error (SSE)','.jpg'))
figure(1)
saveas(gcf,strcat('Curve fitting of Cp data','.jpg'))
Function : -
function [SSE,SSR,SST,R_squr,R_squrAdj,RMS] = error_in_fitting(cp , z , w ,n)
% Sum of squares due to error (SSE)
SSE = sum((cp - z).^2);
SSR = sum((z - mean(cp)).^2);
SST = SSR + SSE;
% R-square
R_squr = SSR/SST;
% Adjusted R-square
R_squrAdj = 1 - ((1 - R_squr)*((n-1)/(n-w-1)));
% Root mean square error (RMSE)
RMS = sqrt(SSE/n);
end
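A short stand-alone usage sketch of this sub-function (the Cp values below are hypothetical, for illustration only):

% Hypothetical stand-alone call to error_in_fitting
cp_obs  = [29.1 29.8 30.6 31.5]';                 % observed Cp values
coeffs  = polyfit((1:4)', cp_obs, 1);             % linear fit
cp_pred = polyval(coeffs, (1:4)');                % predicted Cp values
[SSE, SSR, SST, R2, R2adj, RMS] = error_in_fitting(cp_obs, cp_pred, 1, length(cp_obs));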
Explanation : -
Step 1 : - Load the Cp data from the file 'data.txt' into the MATLAB workspace.
Step 2 : - From the loaded data, extract two individual column vectors: temperature and specific heat (Cp).
Step 3 : - Calculate the fit coefficients with the MATLAB polyfit function, starting with the linear case; the same step is repeated for the quadratic, cubic and higher-degree fits (degrees 1 to 5 in the script).
Step 4 : - Evaluate the fitted polynomials at the temperature points, i.e. obtain the predicted Cp data from the polyfit coefficients, using the MATLAB polyval function.
Step 5 : - Plot both the raw dataset and the predicted datasets and compare them to judge the fitness characteristics of the curves, in particular the linear and cubic fits.
Step 6 : - To measure the fitness characteristics of the curves, the parameters SSE, SSR, SST, R^2, adjusted R^2 and RMSE are calculated. The fit with the highest R^2 (closest to 1) is considered a good fit. A condensed linear-vs-cubic version of this workflow is sketched below.
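Since the AIM asks specifically for a linear and a cubic fit, the same workflow can be condensed to just those two degrees. This is only a sketch, assuming the same data.txt file and the error_in_fitting sub-function shown above are available.

% Minimal linear-vs-cubic comparison, assuming data.txt and error_in_fitting exist
cp_data     = load('data.txt');
temperature = cp_data(:,1);
cp          = cp_data(:,2);
for deg = [1 3]                                   % linear and cubic only
    p = polyfit(temperature, cp, deg);            % fit coefficients
    q = polyval(p, temperature);                  % predicted Cp
    [SSE, SSR, SST, R2, R2adj, RMS] = error_in_fitting(cp, q, deg, length(cp));
    fprintf('Degree %d: SSE = %.4f, R^2 = %.4f, adj R^2 = %.4f, RMSE = %.4f\n', ...
            deg, SSE, R2, R2adj, RMS);
end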
Error : -
No major errors occurred except a few typos, which were rectified immediately.
There was a warning notification in the command window. It is related to how p = polyfit(x, y, n) works: it returns the coefficients of a polynomial p(x) of degree n that is a best fit (in a least-squares sense) for the data in y. The coefficients in p are in descending powers, and the length of p is n+1.
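A quick check of this descending-power ordering on a hypothetical exact line:

% Fitting exact points from y = 2x + 3: coefficients come back in descending powers
x = 0:5;
y = 2*x + 3;
p = polyfit(x, y, 1);     % p is approximately [2 3], i.e. [slope intercept]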
Error 2 : -
An error occurred because the main script could not call the sub-function error_in_fitting.
Results :-
Based on the above work, we can answer the following questions:
Q. How to make a curve fit perfectly?
A curve fits perfectly when the value of R^2 is equal, or almost equal, to 1. When it is exactly equal to 1, the given raw dataset completely overlaps the calculated dataset and, in that case, the fit error is 0. As the order of the polynomial increases, the two curves start to overlap one another; hence, on increasing the order of the polynomial, the curve starts to fit more closely.
Q. How to get the best fit?
To work out the best fit, we should examine both the graphical and numerical fit results of the various degrees of fit (linear, quadratic, etc.) obtained for the data; the fit that has the minimum SSE value and the maximum R-square value in comparison with the other fits is termed the best fit. A small sketch of automating this comparison is shown below.
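In the script above this comparison can be automated using the fitError matrix that was already built (row 1 holds SSE, row 2 holds R-square, and the column index equals the polynomial degree); a small sketch:

% Pick the 'best' degree from the fitError matrix built in the main script
[~, best_by_R2]  = max(fitError(2,:));   % degree with the largest R-square
[~, best_by_SSE] = min(fitError(1,:));   % degree with the smallest SSE
fprintf('Best degree by R^2: %d, by SSE: %d\n', best_by_R2, best_by_SSE);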
Q . What could be done to improve the cubic fit?
The cubic fit can often be improved by normalizing the data, i.e. by selecting the 'Center and scale' checkbox in the Curve Fitting tool, or by centring and scaling in code as sketched below.
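In code, the same normalization is available through polyfit's three-output form; this is a sketch assuming the temperature and cp vectors from the main script are in the workspace.

% Centred-and-scaled cubic fit, assuming temperature and cp exist in the workspace
[p3, S, mu] = polyfit(temperature, cp, 3);      % mu = [mean(temperature); std(temperature)]
cp_scaled   = polyval(p3, temperature, S, mu);  % evaluate with the same centring/scaling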
CONCLUSION : Hence we can conclude that, for a good curve fit, R^2 should be close to 1. From the results obtained, R^2 is approximately 0.93 for the linear curve fit and nearly 0.99 for the cubic curve fit, so it can be said that the cubic polynomial fits better than the linear polynomial. As the order of the polynomial increases, the value of R^2 moves closer to 1 and the fit is assumed to be good.
REFERENCES :
1. How to Evaluate Goodness of Fit
2. polyfit
3. Plotting and Analysing Residuals