The calculator below uses the linear least squares method for curve fitting, in other words, to approximate a one-variable function using regression analysis, just like the Function approximation with regression analysis calculator. But, unlike that calculator, this one can find an approximating function subject to additional constraints on particular points, which means that the computed curve fit must pass through these particular points.
Lagrange multipliers are used to find the curve fit in the constrained case. This imposes some limitations on the regression model, namely, only linear regression models can be used. That's why, unlike the above-mentioned calculator, this one does not include power and exponential regressions. However, it includes 4th- and 5th-order polynomial regressions. Formulas and a brief theory recap can be found below the calculator, as usual.
Note that if the x-values field is left empty, the calculator assumes that x starts at zero and increases in increments of 1.
Linear least squares (LLS)
Linear least squares (LLS) is the least squares approximation of linear functions to data. The method of least squares is a standard approach in regression analysis to approximate the solution of overdetermined systems (sets of equations in which there are more equations than unknowns) by minimizing the sum of the squares of the residuals of every single equation.
You can find more information, including formulas, about the least squares approximation at Function approximation with regression analysis.
Here we deal with linear regression models, where the approximating function is a linear combination of parameters to be determined. The determined values, of course, should minimize the sum of the squares of the residuals.
Suppose we have a set of data points $(x_1, y_1), (x_2, y_2), \dots, (x_n, y_n)$.
Our approximating function is a linear combination of the parameters to be determined, for example:
$$f(x) = a_0 + a_1 x + a_2 x^2$$
We can use matrix notation to express the values of this function:
$$\begin{pmatrix} f(x_1) \\ \vdots \\ f(x_n) \end{pmatrix} = \begin{pmatrix} 1 & x_1 & x_1^2 \\ \vdots & \vdots & \vdots \\ 1 & x_n & x_n^2 \end{pmatrix} \begin{pmatrix} a_0 \\ a_1 \\ a_2 \end{pmatrix}$$
Or, in short notation:
$$\hat{y} = Xa$$
Since we are using the least squares approximation, we should minimize the following function:
$$S = \sum_{i=1}^{n} \left( y_i - f(x_i) \right)^2$$
or, in matrix form:
$$S = \| y - Xa \|^2$$
This value is the squared distance between the vector y and the vector Xa. To minimize this distance, Xa should be the projection of y onto the column space of X, and the vector Xa − y should be orthogonal to that space.
This is possible only when
$$(Xa - y)^T X v = 0,$$
where v is an arbitrary coefficient vector (so that Xv ranges over the column space of X). Since v is arbitrary, the only way to satisfy the condition above is to have
$$X^T (Xa - y) = 0,$$
which yields the normal equations and their solution:
$$a = (X^T X)^{-1} X^T y$$
The calculator uses the formula above in the case of the unconstrained linear least squares method.
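As an illustration, here is a minimal sketch in Python/NumPy of the unconstrained formula above; the data values are made up for the example, and a quadratic model is assumed:

```python
import numpy as np

# Made-up sample data to fit with a quadratic f(x) = a0 + a1*x + a2*x^2.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 9.2, 19.1, 32.8])

# Design matrix X with columns 1, x, x^2.
X = np.vstack([np.ones_like(x), x, x**2]).T

# Normal equations: X^T X a = X^T y, i.e. a = (X^T X)^{-1} X^T y.
# np.linalg.solve is preferred over forming the inverse explicitly.
a = np.linalg.solve(X.T @ X, X.T @ y)
print(a)  # fitted coefficients a0, a1, a2
```

The residual vector Xa − y is orthogonal to the columns of X, which is exactly the condition derived above.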
Now let's talk about constraints. These could be:
- curve-fit must pass through particular points (this is supported by the calculator)
- the slope of the curve at particular points must be equal to particular values.
So we need to find the approximating function which, on the one hand, should minimize the sum of squares:
$$S = \| Xa - y \|^2 \to \min$$
and, on the other hand, should satisfy the conditions:
$$f(x_{c_i}) = y_{c_i}, \quad i = 1, \dots, m$$
or, in matrix form,
$$Ca = d$$
This is called a conditional (constrained) extremum problem, and it is solved by constructing the Lagrangian using Lagrange multipliers.
In our case the Lagrangian is
$$L(a, \lambda) = \| Xa - y \|^2 + \lambda^T (Ca - d)$$
and the task is to find its extremum. After some derivations, which I will not list here, the formula for finding the parameters is
$$\begin{pmatrix} a \\ \lambda \end{pmatrix} = \begin{pmatrix} 2X^T X & C^T \\ C & 0 \end{pmatrix}^{-1} \begin{pmatrix} 2X^T y \\ d \end{pmatrix}$$
The calculator uses the formula above in the case of the constrained linear least squares method.
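As a sketch of the constrained case, the same made-up quadratic data can be fitted with the additional requirement that the curve pass through the point (0, 1); the block system is a direct transcription of the formula above:

```python
import numpy as np

# Made-up data; quadratic model f(x) = a0 + a1*x + a2*x^2.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 9.2, 19.1, 32.8])
X = np.vstack([np.ones_like(x), x, x**2]).T

# Constraint Ca = d: force the curve through (0, 1), i.e. f(0) = a0 = 1.
C = np.array([[1.0, 0.0, 0.0]])
d = np.array([1.0])

# Block system from the Lagrangian:
#   [2 X^T X   C^T] [a]        [2 X^T y]
#   [   C       0 ] [lambda] = [   d   ]
n, m = X.shape[1], C.shape[0]
K = np.block([[2 * X.T @ X, C.T],
              [C, np.zeros((m, m))]])
rhs = np.concatenate([2 * X.T @ y, d])
sol = np.linalg.solve(K, rhs)
a, lam = sol[:n], sol[n:]
print(a)  # constrained coefficients; a[0] should be 1 up to rounding
```

Among all quadratics that pass through (0, 1), this choice of a minimizes the sum of squared residuals; lam holds the Lagrange multipliers.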