Polynomial Regression – Least-Squares Fitting
This brief article will demonstrate how to work out polynomial regressions in Matlab (also known as polynomial least-squares fittings). The idea is to find the polynomial function that properly fits a given set of data points. For this purpose, we're going to use two useful built-in functions: polyfit (to find the polynomial that fits the data) and polyval (to evaluate polynomials).
This is the simplest way to use these functions:
p = polyfit(x, y, n) finds the coefficients of a polynomial p(x) of degree n that fits the data y best in a least-squares sense. p is a row vector of length n + 1 containing the polynomial coefficients in descending powers, p(1)*x^n + p(2)*x^(n - 1) + ... + p(n)*x + p(n + 1).
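As an aside for readers following along without Matlab, NumPy mirrors this interface with numpy.polyfit, using the same descending-powers convention. A minimal sketch (the data points here are made up for illustration; they lie exactly on 2x^2 - 3x + 1):

```python
import numpy as np

# Illustrative data, not the article's: points on 2x^2 - 3x + 1
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.0, 0.0, 3.0, 10.0, 21.0])

# Degree-2 least-squares fit; as in Matlab, coefficients come back
# in descending powers: p[0]*x^2 + p[1]*x + p[2]
p = np.polyfit(x, y, 2)
print(p)          # close to [ 2. -3.  1.]
print(len(p))     # a degree-n fit returns n + 1 coefficients
```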
y = polyval(p, x) returns the value of a polynomial p evaluated at x. p is a vector of length n + 1 whose elements are the coefficients of the polynomial in descending powers, y = p(1)*x^n + p(2)*x^(n - 1) + ... + p(n)*x + p(n + 1). If x is a matrix or vector, the polynomial is evaluated at all points in x.
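Again as an aside, numpy.polyval behaves the same way, evaluating a descending-powers coefficient vector at a scalar or elementwise over an array (the coefficients below are illustrative, encoding 2x^2 - 3x + 1):

```python
import numpy as np

# Coefficients in descending powers: 2x^2 - 3x + 1 (illustrative values)
p = [2.0, -3.0, 1.0]

# Evaluate at a single point: 2*4 - 3*2 + 1
print(np.polyval(p, 2.0))        # 3.0

# Evaluate elementwise over a vector of points
xs = np.array([0.0, 1.0, 2.0])
print(np.polyval(p, xs))         # [1. 0. 3.]
```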
Let's say that we're given these points:
x = [1 2 3 4 5.5 7 10]
y = [3 7 9 15 22 21 21]
and we want to explore fits of 2nd, 4th and 5th order. We could use the following code:
clear all, clc, close all, format compact
x = [1 2 3 4 5.5 7 10];
y = [3 7 9 15 22 21 21];
% plot given data in red
plot(x, y, 'ro', 'linewidth', 2)
hold on
% find polynomial fits of different orders
p2 = polyfit(x, y, 2)
p4 = polyfit(x, y, 4)
p5 = polyfit(x, y, 5)
% see interpolated values of fits
xc = 1 : .1 : 10;
y2 = polyval(p2, xc);
y4 = polyval(p4, xc);
y5 = polyval(p5, xc);
% plot 2nd order polynomial
plot(xc, y2, 'linewidth', 2)
% plot 4th order polynomial
plot(xc, y4, 'linewidth', 2)
% plot 5th order polynomial
plot(xc, y5, 'k.', 'linewidth', 2)
legend('data', '2nd. order fit', '4th. order fit', '5th. order fit')
The resulting fits are displayed in this figure:
The coefficients found by Matlab are:

p2 = -0.3863    6.3983   -4.1596
p4 =  0.0334   ...
p5 =  0.0213   ...

These, in other words, represent the following polynomials of 2nd, 4th and 5th order, respectively:

p2(x) = -0.3863x^2 + 6.3983x - 4.1596
p4(x) = 0.0334x^4 + ... + 5.0158x^2 - 8.4137x + 7.5779
p5(x) = 0.0213x^5 + ... + 4.1330x^3 - 14.5708x^2 + 25.3233x - 11.3510
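If you want to double-check the 2nd-order fit outside Matlab, numpy.polyfit (an aside; NumPy is not part of the article's Matlab session) uses the same descending-powers convention and reproduces the coefficients above on the article's data:

```python
import numpy as np

# The article's data points
x = np.array([1, 2, 3, 4, 5.5, 7, 10])
y = np.array([3, 7, 9, 15, 22, 21, 21])

# 2nd-order least-squares fit, descending powers like Matlab's polyfit
p2 = np.polyfit(x, y, 2)
print(np.round(p2, 4))   # approximately [-0.3863  6.3983 -4.1596]
```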
You can watch this video to see how to do it even more easily with the Curve Fitting Tool integrated into the Figure window.