Least-squares problem - Orthogonality and Least Squares
Notes:
In linear algebra, we have dealt with equations $Ax = b$ that do not have a solution. When a solution does not exist, the best thing we can do is to approximate $x$. In this section, we will learn how to find an $\hat{x}$ such that $A\hat{x}$ is as close as possible to $b$.
If $A$ is an $m \times n$ matrix and $b$ is a vector in $\mathbb{R}^m$, then a least-squares solution of $Ax = b$ is an $\hat{x}$ in $\mathbb{R}^n$ such that
$$\|b - A\hat{x}\| \leq \|b - Ax\|$$
for all $x$ in $\mathbb{R}^n$.
The smaller the distance $\|b - Ax\|$, the smaller the error, and thus the better the approximation. So the smallest distance gives the best approximation for $b$, and we call this best approximation $\hat{b} = A\hat{x}$.
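As a minimal numerical sketch of this definition (the matrix $A$ and vector $b$ below are made-up example data, not taken from these notes), we can compare the residual norm $\|b - A\hat{x}\|$ of a least-squares solution with the residual of some other vector $x$:

```python
import numpy as np

# Made-up inconsistent system: 3 equations, 2 unknowns
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])

# x_hat minimizes ||b - Ax|| over all x
x_hat, *_ = np.linalg.lstsq(A, b, rcond=None)

# Any other choice of x gives a residual that is at least as large
x_other = np.array([1.0, 1.0])
print("||b - A x_hat||   =", np.linalg.norm(b - A @ x_hat))
print("||b - A x_other|| =", np.linalg.norm(b - A @ x_other))
```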
The Least-Squares Solution
The set of least-squares solutions of $Ax = b$ coincides with the non-empty set of solutions of the normal equations $A^T A x = A^T b$.
In other words,
$$A^T A x = A^T b \quad \rightarrow \quad \hat{x} = (A^T A)^{-1} A^T b$$
where $\hat{x}$ is the least-squares solution of $Ax = b$.
Keep in mind that $\hat{x}$ is not always a unique solution. However, it is unique if any one of the following (equivalent) conditions holds; a numerical sketch follows this list:
1. The equation $Ax = b$ has a unique least-squares solution for each $b$ in $\mathbb{R}^m$.
2. The columns of $A$ are linearly independent.
3. The matrix $A^T A$ is invertible.
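As a sketch of solving the normal equations and checking these conditions (again with made-up example data), one might write:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])

AtA = A.T @ A   # A^T A
Atb = A.T @ b   # A^T b

# Columns of A are linearly independent <=> A^T A is invertible
# <=> the least-squares solution is unique.
print("rank(A)    =", np.linalg.matrix_rank(A))  # 2 = number of columns
print("det(A^T A) =", np.linalg.det(AtA))        # nonzero, so invertible

# Solve the normal equations A^T A x = A^T b for x_hat
x_hat = np.linalg.solve(AtA, Atb)
print("x_hat =", x_hat)
```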
The Least-Squares Error
To find the least-squares error of the least-squares solution $\hat{x}$ of $Ax = b$, we compute the distance
$$\|b - A\hat{x}\|$$
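In code, this is just the norm of the residual vector (a short sketch, reusing the made-up data from above):

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])

# Least-squares solution via the normal equations
x_hat = np.linalg.solve(A.T @ A, A.T @ b)

# Least-squares error: distance between b and its best approximation A x_hat
error = np.linalg.norm(b - A @ x_hat)
print("least-squares error =", error)
```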
Alternative Calculations to Least-Squares Solutions
Let $A$ be an $m \times n$ matrix where $a_1, \dots, a_n$ are the columns of $A$. If $\{a_1, \dots, a_n\}$ form an orthogonal set, then we can find the least-squares solution using the equation
$$A\hat{x} = \hat{b}$$
where
$$\hat{b} = \text{proj}_{\text{Col}\,A}\, b = \frac{b \cdot a_1}{a_1 \cdot a_1}a_1 + \cdots + \frac{b \cdot a_n}{a_n \cdot a_n}a_n$$
Since the columns are orthogonal, the weights $\frac{b \cdot a_j}{a_j \cdot a_j}$ are exactly the entries of $\hat{x}$.
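A sketch of this orthogonal-columns shortcut, with made-up columns $a_1, a_2$ chosen so that $a_1 \cdot a_2 = 0$:

```python
import numpy as np

# Made-up example where the columns of A are orthogonal (a1 . a2 = 0)
a1 = np.array([2.0, 1.0, 1.0])
a2 = np.array([-1.0, 1.0, 1.0])
A = np.column_stack([a1, a2])
b = np.array([3.0, 1.0, 2.0])

# Entries of x_hat are the projection weights (b . a_j) / (a_j . a_j)
x_hat = np.array([np.dot(b, a1) / np.dot(a1, a1),
                  np.dot(b, a2) / np.dot(a2, a2)])

# b_hat = A x_hat is the orthogonal projection of b onto Col A
b_hat = A @ x_hat
print("x_hat =", x_hat)
print("b_hat =", b_hat)

# Same answer as the normal equations give
print(np.allclose(x_hat, np.linalg.solve(A.T @ A, A.T @ b)))  # True
```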
Let $A$ be an $m \times n$ matrix with linearly independent columns, and let $A = QR$ be the $QR$ factorization of $A$. Then for each $b$ in $\mathbb{R}^m$, the equation $Ax = b$ has a unique least-squares solution $\hat{x}$, where
$$\hat{x} = R^{-1} Q^T b$$
Equivalently, $\hat{x}$ is the solution of
$$R\hat{x} = Q^T b$$
which can be solved quickly by back-substitution, since $R$ is upper triangular.
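A sketch of the $QR$ route (same made-up $A$ and $b$ as in the earlier sketches):

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])

# Reduced QR factorization: Q has orthonormal columns, R is upper triangular
Q, R = np.linalg.qr(A)

# Unique least-squares solution: x_hat = R^{-1} Q^T b,
# i.e. solve the triangular system R x_hat = Q^T b
x_hat = np.linalg.solve(R, Q.T @ b)
print("x_hat =", x_hat)

# Agrees with the normal-equations solution
print(np.allclose(x_hat, np.linalg.solve(A.T @ A, A.T @ b)))  # True
```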