# Orthogonal projections

##### Intros
###### Lessons
1. Orthogonal Projections Overview:
2. The Orthogonal Decomposition Theorem
• Write $y$ as the sum of two vectors $\hat{y}$ and $z$
• Orthogonal basis → $\hat{y}= \frac{y \cdot v_1}{v_1 \cdot v_1}v_1 + \cdots + \frac{y \cdot v_p}{v_p \cdot v_p}v_p$
• Orthonormal basis → $\hat{y}=(y\cdot v_1)v_1+\cdots +(y\cdot v_p)v_p$
$z=y - \hat{y}$
3. Property of Orthogonal Projections
• proj$_S \, y=y$
• Only holds if $y$ is in $S$
4. The Best Approximation Theorem
• What is the point closest to $y$ in $S$? $\hat{y}$!
• Reason why: $\lVert y - \hat{y} \rVert < \lVert y-u \rVert$ for every $u$ in $S$ distinct from $\hat{y}$
• The distance between $y$ and $\hat{y}$
##### Examples
###### Lessons
1. The Orthogonal Decomposition Theorem
Assume that {$v_1,v_2,v_3$} is an orthogonal basis for $\Bbb{R}^n$. Write $y$ as the sum of two vectors, one in Span{$v_1$}, and one in Span{$v_2,v_3$}.
2. Verify that {$v_1,v_2$} is an orthonormal set, and then find the orthogonal projection of $y$ onto Span{$v_1,v_2$}.
3. Best Approximation
Find the best approximation of $y$ by vectors of the form $c_1 v_1+c_2 v_2$.
4. Finding the Closest Point and Distance
Find the closest point to $y$ in the subspace $S$ spanned by $v_1$ and $v_2$.
5. Find the closest distance from $y$ to $S=$Span{$v_1,v_2$}.

## Orthogonal projections

#### What is an orthogonal projection

The orthogonal projection of a vector onto another is, just as the name says, the projection of the first vector onto the second one. What is that supposed to mean? For a better explanation, let us show you graphically in the next figure:

Notice that on the right-hand side of figure 1 we added the vector which is the orthogonal projection of $a$ onto $b$. The word orthogonal (which, as you already know, means there is perpendicularity involved) comes from the right angle made between the projection and the line connecting the projection to the original vector. For this case, that line (the dashed line in figure 1) is the component of vector $a$ that is orthogonal to $b$, and its length can be written as $\parallel a- \hat{a} \parallel$, where $\hat{a}$ denotes the projection of $a$ onto $b$.

#### Orthogonal projection vector

A formal orthogonal projection definition would be that it refers to the projection of a vector onto the line spanned by another vector. In other words, taking figure 1 in mind, the projection of vector $a$ falls along the direction of vector $b$, and so the projection of vector $a$ is a vector parallel to vector $b$.

And so, if we consider a subspace spanned by the vector $v$, then the orthogonal projection of $y$ onto $v$ is defined as $\hat{y}$ and can be calculated with the next equation:

$\text{proj}_L \, y= \hat{y}= \large \frac{y \, \cdot \, v}{v \, \cdot \, v}$ $v$

Where $\hat{y}$ is called the orthogonal projection vector, and so equation 1 may be referred to (in general) as the orthogonal projection formula. Notice that the component of $y$ orthogonal to $v$ is equal to $z=y-\hat{y}$.
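As a quick sketch of equation 1, here is how this single-vector projection can be computed in plain Python; the numeric vectors used in the demo are the $y$ and $v_1$ that appear later in Example 1:

```python
def dot(a, b):
    """Dot product of two vectors given as lists of numbers."""
    return sum(x * y for x, y in zip(a, b))

def project_onto_vector(y, v):
    """Orthogonal projection of y onto the line spanned by v (equation 1)."""
    c = dot(y, v) / dot(v, v)      # the scalar (y . v) / (v . v)
    return [c * vi for vi in v]

y = [4, 3, 4]   # the y from Example 1 below
v = [1, 2, 2]   # the v1 from Example 1 below
y_hat = project_onto_vector(y, v)            # [2.0, 4.0, 4.0], i.e. 2*v
z = [yi - pi for yi, pi in zip(y, y_hat)]    # component of y orthogonal to v
print(y_hat, dot(z, v))                      # the second value is 0
```

Checking that $z \cdot v = 0$ confirms that the leftover component really is orthogonal to $v$.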

Now let us talk about orthogonal projections onto a subspace, not another vector, but a plane (or a higher-dimensional subspace). For that: Let $S$ be a subspace of $\Bbb{R}^n$; then each vector $y$ in $\Bbb{R}^n$ can be written as: $y= \hat{y}+z$

Where $\hat{y}$ is in $S$ and $z=y-\hat{y}$ is in $S^{\perp}$. And so, $\hat{y}$ is the orthogonal projection of $y$ onto $S$.
Therefore, if we want to calculate $\hat{y}$, we need to check that {$v_1, ... , v_p$} is an orthogonal basis of $S$ (remember, this can be checked by verifying that the dot product of every pair of distinct vectors in the set is zero). If so, then we can calculate the projection of $y$ onto $S$ as:

$\text{proj}_S \, y = \hat{y} =$ $\large \frac{y \, \cdot \, v_1} {v_1 \, \cdot \, v_1}$ $v_1 \, + \,$ $\large \frac{y \, \cdot \, v_2} {v_2 \, \cdot \, v_2}$ $v_2 \, + \, ... \,$ $\large \frac{y \, \cdot \, v_p} {v_p \, \cdot \, v_p}$ $v_p$

However, if {$v_1, ... , v_p$} is an orthonormal basis of $S$, then the equation changes a little bit:

$\text{proj}_S \, y = \hat{y} = (y \cdot v_1)v_1 \, +\, (y \cdot v_2)v_2 \, +...+ \, (y \cdot v_p)v_p$

Remember that an orthonormal basis is a basis composed of a set of vectors which are orthogonal to each other AND, at the same time, are all unit vectors themselves.

#### How to find orthogonal projection

The steps to find the orthogonal projection of vector y onto a subspace are as follows:
1. Verify that the set of vectors provided is either an orthogonal basis or an orthonormal basis
   1. If it is an orthogonal basis, continue with step 2
   2. If it is an orthonormal basis, skip to step 3

2. Having an orthogonal basis containing a set of vectors {$v_1,... , v_p$}, compute the projection of $y$ onto $S$ by solving the formula found in equation 2. In order to do that, follow the next steps:
1. Calculate the dot products $y \cdot v_1, ... , y \cdot v_p$
2. Calculate the dot products $v_1 \cdot v_1 , ... , v_p \cdot v_p$
3. Compute the divisions $\large \frac{y \, \cdot \, v_1} {v_1 \, \cdot \, v_1} , ... , \frac{y \, \cdot \, v_p} {v_p \, \cdot \, v_p}$
4. Multiply each result from our last step with its corresponding vector {$v_1 ,... , v_p$}
5. Add all of the resulting vectors together to find the final projection vector.

3. Having an orthonormal basis containing a set of vectors {$v_1 ,... , v_p$}, compute the projection of $y$ onto $S$ by solving the formula found in equation 3. In order to do that, follow the next steps:
1. Calculate the dot products $y \cdot v_1 , ... , y \cdot v_p$
2. Multiply each result from our last step with its corresponding vector {$v_1 ,... , v_p$}
3. Add all of the resulting vectors together to find the final projection vector.
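The steps above can be sketched as a small Python function; the basis vectors in the demo are the orthogonal pair from Example 4 below, while the choice $y = v_1 + v_2$ is our own, picked so that $y$ lies in $S$:

```python
def dot(a, b):
    """Dot product of two vectors given as lists of numbers."""
    return sum(x * y for x, y in zip(a, b))

def project_onto_subspace(y, basis):
    """Projection of y onto Span(basis); assumes `basis` is an orthogonal set.
    For an orthonormal basis every v . v equals 1, so the same code covers
    both equation 2 and equation 3."""
    p = [0.0] * len(y)
    for v in basis:
        c = dot(y, v) / dot(v, v)                   # steps 2.1-2.3: weight along v
        p = [pi + c * vi for pi, vi in zip(p, v)]   # steps 2.4-2.5: accumulate
    return p

v1, v2 = [7, -1, -4], [1, -1, 2]    # the orthogonal pair from Example 4
y = [8, -2, -2]                     # our own choice: y = v1 + v2, so y is in S
print(project_onto_subspace(y, [v1, v2]))  # [8.0, -2.0, -2.0]
```

Because this $y$ is already in $S$, the output equals $y$ itself, which also illustrates the property $\text{proj}_S \, y = y$.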

And now you are ready to solve some exercise problems!

#### Example 1

Assume that {$v_1, v_2, v_3$} is an orthogonal basis for $\Bbb{R}^n$. Write $y$ as a sum of two vectors, one in Span{$v_1$} and one in Span{$v_2, v_3$}. The vectors $v_1, v_2, v_3$ and $y$ are defined as follows:

For this first problem we are already assuming that the vectors provided form an orthogonal basis of $\Bbb{R}^n$; that means the vectors are orthogonal to each other and linearly independent. Therefore the spans Span{$v_1$} and Span{$v_2,v_3$} have bases {$v_1$} and {$v_2, v_3$}, each containing orthogonal (and thus linearly independent) vectors, which makes them orthogonal bases!
Therefore, we can be sure that we can use equation 2 to solve for $\hat{y}$ in BOTH cases: projecting $y$ onto Span{$v_1$} and projecting $y$ onto Span{$v_2, v_3$}.

If we need to write $y$ as a sum of two vectors, remember from figures 2 and 3 that $y = \hat{y} + z$.
And so, we calculate $\hat{y}$ first and then obtain $z = y - \hat{y}$.
We will work only on the first part of the problem, the projection of $y$ onto Span{$v_1$}, and leave the second case for you to solve on your own.

So let us calculate $\hat{y}$!
For this case we have only one vector in the basis of the span $S$, and so, the formula goes as:

$\large \hat{y} = \frac{y \cdot v_1}{v_1 \cdot v_1}v_1 = \frac{(4)(1)+(3)(2)+(4)(2)}{(1)(1)+(2)(2)+(2)(2)}v_1 = \frac{4+6+8}{1+4+4}v_1 = \frac{18}{9}v_1= 2v_1$

With that we can now write $y$ as a sum of two vectors, $y = \hat{y} + z$, with $\hat{y}=2v_1$ in Span{$v_1$}. And what is $z$? Easy! We can calculate it as $z = y - \hat{y}$.

#### Example 2

Verify that {$v_1, v_2$} is an orthonormal set, and then find the orthogonal projection of $y$ onto Span{$v_1, v_2$}.

To verify that the set {$v_1, v_2$} is orthonormal, we first check whether the vectors in the set are orthogonal to each other by computing their dot product:

$v_1 \cdot v_2 = (\frac{1}{\sqrt{2}})(-\frac{1}{\sqrt{3}})+(0)(\frac{1}{\sqrt{3}})+(\frac{1}{\sqrt{2}})(\frac{1}{\sqrt{3}}) = -\frac{1}{\sqrt{6}}+\frac{1}{\sqrt{6}} = 0$

Since the dot product yielded a result of zero, the vectors are orthogonal to each other. The second condition for the set to be orthonormal is that its vectors are unit vectors; thus, let us check whether their magnitudes equal one.

$\large\parallel\; v_1\parallel \enspace = \enspace \sqrt{(\frac{1}{\sqrt{2}})^2+(0)^2+(\frac{1}{\sqrt{2}})^2 } = \sqrt{ \frac{1}{2}+ \frac{1}{2}} =\sqrt{1} =1$

$\large\parallel\; v_2\parallel \enspace = \enspace \sqrt{(-\frac{1}{\sqrt{3}})^2+(\frac{1}{\sqrt{3}})^2+(\frac{1}{\sqrt{3}})^2 } = \sqrt{ \frac{1}{3}+ \frac{1}{3}+\frac{1}{3}} =\sqrt{1} =1$

And so, we have an orthonormal set, since we just proved that the vectors $v_1$ and $v_2$ are orthogonal unit vectors. Now we have to find the orthogonal projection of $y$ onto Span{$v_1, v_2$}.
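Both checks (orthogonality and unit length) can also be confirmed numerically; the vectors below are the $v_1$ and $v_2$ of this example, and `math.isclose` absorbs floating-point rounding:

```python
import math

def dot(a, b):
    """Dot product of two vectors given as lists of numbers."""
    return sum(x * y for x, y in zip(a, b))

r2, r3 = math.sqrt(2), math.sqrt(3)
v1 = [1 / r2, 0, 1 / r2]          # the v1 from this example
v2 = [-1 / r3, 1 / r3, 1 / r3]    # the v2 from this example

print(math.isclose(dot(v1, v2), 0, abs_tol=1e-12))  # True: orthogonal
print(math.isclose(dot(v1, v1), 1))                 # True: unit length
print(math.isclose(dot(v2, v2), 1))                 # True: unit length
```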

$\text{proj}_S \, y = \hat{y} = (y \cdot v_1)v_1 \, + \,(y \cdot v_2)v_2$

Using this equation, we plug in the values that we have for vectors $v_1, v_2$ and $y$ in order to calculate the projection vector $\hat{y}$.

#### Example 3

Find the best approximation of $y$ by vectors of the form $c_1v_1 \, + \, c_2v_2$, where the vectors are defined as follows:

Having vectors of the form $c_1v_1 \, + \, c_2v_2$ means that we have a linear combination, which is the same as having a span of vectors; in this case, the span of vectors $v_1$ and $v_2$. And so, we can define the subspace $S$ as:

$S=\text{Span}${$v_1,v_2$}, the set of all vectors of the form $c_1 v_1 \,+ \, c_2v_2$

Now the first thing to do in order to find the best approximation of $y$ is to check whether the basis provided is an orthogonal basis; for that, we compute the dot product of the two vectors in the basis. Given that the set is an orthogonal set (the dot product results in zero), we can now compute the vector $\hat{y}$ (which is the best approximation of $y$) by using the projection formula shown in equation 2:

$\text{proj}_S \, y = \hat{y}$ $\large = \frac{y\, \cdot v_1}{v_1 \, \cdot v_1}$ $v_1$ $\large +\frac{y\, \cdot v_2}{v_2 \, \cdot v_2}$ $v_2$

#### Example 4

Find the closest point to $y$ in the subspace $S$ spanned by $v_1$ and $v_2$:

The closest point to $y$ is simply $\hat{y}$, since it is at the shortest distance from $y$ among all vectors in $S$, given that it is the best approximation of $y$ itself. And so, the purpose of this problem is to calculate $\hat{y}$.
For that we will use equation 2 once more in order to calculate the orthogonal projection of $y$ onto $S$, but before we can use that equation we need to check one condition.

First we need to check if we have an orthogonal basis for $S$.
Here we have a subspace spanned by $v_1$ and $v_2$, that is, the set of all linear combinations of $v_1$ and $v_2$: $S=$ Span{$v_1 , v_2$}. In order to check whether {$v_1, v_2$} is an orthogonal basis of $S$, we have to see whether $v_1$ and $v_2$ are orthogonal to each other; therefore, we compute their dot product!

$v_1 \, \cdot \, v_2=(7)(1)+(-1)(-1)+(-4)(2)=7+1-8=0$

And so, we have an orthogonal basis since the dot product above yielded a result of zero.
So now we can calculate $\hat{y}$ using equation 2:

$\text{proj}_S \, y = \hat{y}$ $\large = \frac{y\, \cdot v_1}{v_1 \, \cdot v_1}$ $v_1$ $\large +\frac{y\, \cdot v_2}{v_2 \, \cdot v_2}$ $v_2$

#### Example 5

Find the closest distance from $y$ to $S=$ Span{$v_1, v_2$} if $v_1, v_2$ and $y$ are defined as below:

In this case, the closest distance from $y$ to $S$ can be represented graphically as the length of the segment from $y$ to its projection $\hat{y}$. Therefore the closest distance is equal to the magnitude of the difference of the vectors $y$ and $\hat{y}$: closest distance $= \enspace \parallel y -\hat{y}\parallel$.
So if we want to calculate the closest distance, we need to compute $\hat{y}$ and for that, we first need to check that we have an orthogonal basis, and so, we check for an orthogonal basis by calculating the dot product of $v_1$ and $v_2$:

$v_1 \cdot v_2 = (-1)(1)+(1)(1)+(0)(0)=-1+1+0=0$

Knowing that we have an orthogonal basis due to the result above, we can now compute $\hat{y}$ using equation 2. And once we have the vector $\hat{y}$, we can finally compute the distance $\parallel y -\hat{y}\parallel$.

***
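A minimal sketch of the whole computation, using the orthogonal basis of this example; since the original $y$ was given in a figure that is not reproduced here, the $y$ below is our own assumed stand-in:

```python
import math

def dot(a, b):
    """Dot product of two vectors given as lists of numbers."""
    return sum(x * y for x, y in zip(a, b))

def project(y, basis):
    """Projection of y onto Span(basis), assuming `basis` is orthogonal."""
    p = [0.0] * len(y)
    for v in basis:
        c = dot(y, v) / dot(v, v)
        p = [pi + c * vi for pi, vi in zip(p, v)]
    return p

v1, v2 = [-1, 1, 0], [1, 1, 0]   # the orthogonal basis from this example
y = [3, 5, 2]                    # hypothetical y (assumed; not from the source)

y_hat = project(y, [v1, v2])                 # [3.0, 5.0, 0.0]
z = [yi - pi for yi, pi in zip(y, y_hat)]    # [0.0, 0.0, 2.0]
print(math.sqrt(dot(z, z)))                  # 2.0: the closest distance
```

Note that Span{$v_1, v_2$} here is the $xy$-plane, so the projection simply zeroes out the third coordinate and the distance is the height of $y$ above that plane.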

And so, we have arrived at the end of our lesson; we hope you enjoyed it and see you in the next topic!
The Orthogonal Decomposition Theorem
Let $S$ be a subspace in $\Bbb{R}^n$. Then each vector $y$ in $\Bbb{R}^n$ can be written as:

$y=\hat{y}+z$

where $\hat{y}$ is in $S$ and $z$ is in $S^{\perp}$. Note that $\hat{y}$ is the orthogonal projection of $y$ onto $S$

If {$v_1,\cdots ,v_p$} is an orthogonal basis of $S$, then

$proj_{S}y=\hat{y}=\frac{y \cdot v_1}{v_1 \cdot v_1}v_1 + \frac{y \cdot v_2}{v_2 \cdot v_2}v_2 + \cdots + \frac{y \cdot v_p}{v_p \cdot v_p}v_p$

However if {$v_1,\cdots ,v_p$} is an orthonormal basis of $S$, then

$proj_{S}y=\hat{y}=(y \cdot v_1)v_1+(y \cdot v_2)v_2 + \cdots + (y \cdot v_p)v_p$

Property of Orthogonal Projection
If {$v_1,\cdots ,v_p$} is an orthogonal basis for $S$ and if $y$ happens to be in $S$, then
$proj_{S}y=y$

In other words, if y is in $S=$Span{$v_1,\cdots ,v_p$}, then $proj_{S}y=y$.

The Best Approximation Theorem
Let $S$ be a subspace of $\Bbb{R}^n$. Also, let $y$ be a vector in $\Bbb{R}^n$, and $\hat{y}$ be the orthogonal projection of $y$ onto $S$. Then $\hat{y}$ is the closest point in $S$ to $y$, because

$\lVert y- \hat{y} \rVert$ < $\lVert y-u \rVert$

where $u$ is any vector in $S$ distinct from $\hat{y}$.
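The inequality follows from the Pythagorean theorem: write $y-u=(y-\hat{y})+(\hat{y}-u)$; the first term lies in $S^{\perp}$ and the second in $S$, so the two terms are orthogonal and

```latex
\lVert y-u \rVert^2 = \lVert y-\hat{y} \rVert^2 + \lVert \hat{y}-u \rVert^2 > \lVert y-\hat{y} \rVert^2
```

since $\lVert \hat{y}-u \rVert > 0$ whenever $u \neq \hat{y}$.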