Last week we saw how we can use MATLAB to set up and solve the normal equations to perform a linear least-squares regression on a set of discrete data. Since these data sets can often be quite large, computational software such as MATLAB provides a welcome alternative to setting up and solving these equations by hand, which requires multiple steps of arithmetic followed by solving a system of equations. So far, our regression techniques have been limited to identifying linear relationships – i.e. straight lines – within our data. However, we saw that there are many cases where these types of relationships do not adequately describe the underlying behavior present in the data. While we can still establish linear relationships between non-linear data, the resulting equations often produce substantial errors. In lecture this week we begin to see how we can apply slightly modified versions of our linear least-squares regression techniques to establish polynomial and other nonlinear equations that describe data trends. The purpose of today's lab is to show how these techniques can be implemented in MATLAB, and to highlight a few built-in functions that will be quite useful for quickly fitting both linear and nonlinear relationships to discrete data.
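As a quick refresher, last week's approach can be sketched in MATLAB as follows. This is a minimal illustration, assuming column vectors x and y of measured data; the variable names and data values here are made up for the example:

```matlab
% Illustrative data (column vectors)
x = [1; 2; 3; 4; 5];
y = [2.1; 3.9; 6.2; 7.8; 10.1];
n = length(x);

% Normal equations for the line y = a0 + a1*x:
% [ n         sum(x)    ] [a0]   [ sum(y)    ]
% [ sum(x)    sum(x.^2) ] [a1] = [ sum(x.*y) ]
A = [n, sum(x); sum(x), sum(x.^2)];
b = [sum(y); sum(x.*y)];

% Solve the 2x2 linear system with the backslash operator
coeffs = A \ b;   % coeffs(1) = intercept a0, coeffs(2) = slope a1
```

The backslash operator solves the system directly, replacing the hand arithmetic and elimination steps described above.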
Least-squares linear regression allows us to fit data to equations of the form

y = a0 + a1*x
Since a0 and a1 are just constants, we are going to use a slightly different notation for now, and rewrite our equation as

y = p1*x + p2
where now the goal of our regression is to identify the values of p1 and p2 that produce the best-fit line to our data.
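This coefficient ordering matches MATLAB's built-in polyfit function, which performs the entire regression in a single call. A minimal sketch, using the same illustrative data as before:

```matlab
% Illustrative data
x = [1; 2; 3; 4; 5];
y = [2.1; 3.9; 6.2; 7.8; 10.1];

% Fit a first-degree polynomial (a straight line) to the data
p = polyfit(x, y, 1);   % p(1) = slope p1, p(2) = intercept p2

% Evaluate the best-fit line at the original x values
y_fit = polyval(p, x);
```

Note that polyfit returns the coefficients in descending powers of x, which is why the slope comes first in our rewritten notation.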
Recall from our early algebra classes that polynomials are mathematical expressions of the form

f(x) = p1*x^n + p2*x^(n-1) + ... + pn*x + p(n+1)
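The same polyfit call extends to polynomials of any degree: the third argument sets the degree n. For example, fitting a quadratic (degree 2) to some illustrative noisy data:

```matlab
% Illustrative data: a noisy quadratic trend
x = (0:0.5:4)';
y = 1.5*x.^2 - 2*x + 3 + 0.2*randn(size(x));

% Fit a second-degree polynomial to the data
p = polyfit(x, y, 2);        % p(1)*x^2 + p(2)*x + p(3)
y_fit = polyval(p, x);

% Compare the data against the fitted curve
plot(x, y, 'o', x, y_fit, '-')
legend('data', 'quadratic fit')
```

As with the linear case, polyfit returns n+1 coefficients in descending powers of x, and polyval evaluates the resulting polynomial.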