# Orthogonal projections



##### Intros

###### Lessons

**Orthogonal Projections Overview:**

__The Orthogonal Decomposition Theorem__

• Write $y$ as the sum of two vectors $\hat{y}$ and $z$

• Orthogonal basis → $\hat{y}= \frac{y \cdot v_1}{v_1 \cdot v_1}v_1 + \cdots + \frac{y \cdot v_p}{v_p \cdot v_p}v_p$

• Orthonormal basis → $\hat{y}=(y\cdot v_1)v_1+\cdots +(y\cdot v_p)v_p$

• $z=y - \hat{y}$

__Property of Orthogonal Projections__

• proj$_S y=y$

• Only holds if $y$ is in $S$

__The Best Approximation Theorem__

• What is the point closest to $y$ in $S$? $\hat{y}$!

• Reason why: $\lVert y - \hat{y} \rVert < \lVert y-u \rVert$ for every $u$ in $S$ distinct from $\hat{y}$

• The distance between $y$ and $\hat{y}$

##### Examples

###### Lessons

**The Orthogonal Decomposition Theorem**

Assume that {$v_1,v_2,v_3$} is an orthogonal basis for $\Bbb{R}^n$. Write $y$ as the sum of two vectors, one in Span{$v_1$}, and one in Span{$v_2,v_3$}. The vectors $v_1, v_2, v_3$, and $y$ are given.

- Verify that {$v_1,v_2$} is an orthonormal set, and then find the orthogonal projection of $y$ onto Span{$v_1,v_2$}.

**Best Approximation**

Find the best approximation of $y$ by vectors of the form $c_1 v_1+c_2 v_2$, where $v_1$, $v_2$, and $y$ are given.

**Finding the Closest Point and Distance**

- Find the closest point to $y$ in the subspace $S$ spanned by $v_1$ and $v_2$.
- Find the closest distance from $y$ to $S=$Span{$v_1,v_2$}, given $v_1$, $v_2$, and $y$.


###### Topic Notes

## Orthogonal projections

#### What is an orthogonal projection

The orthogonal projection of a vector onto another is, just as the name says, the projection of the first vector onto the second one. You may be wondering what that is supposed to mean; for a better explanation, let us show you graphically in the next figure.

Notice that on the right-hand side of the figure we have added the vector which is the orthogonal projection of $a$ onto $b$. The word orthogonal (which, as you already know, means there is perpendicularity involved) comes from the right angle made by the projection and the normal line connecting the projection to the original vector. For this case, this normal line (the dashed line in figure 1) is the component of vector $a$ that is orthogonal to $b$, and its length can be written as $\parallel a- \text{proj}_{b}\,a \parallel$.

#### Orthogonal projection vector

A formal orthogonal projection definition would be that it is the projection of a vector onto the line spanned by another vector. In other words, and taking figure 1 in mind, the projection of vector $a$ falls on the line of vector $b$, and so the projection of vector $a$ is a vector parallel to vector $b$.

And so, if we consider a subspace spanned by the vector $v$, then the orthogonal projection of $y$ onto $v$ is defined as $\hat{y}$ and can be calculated with the next equation:

proj$_{v}y= \hat{y}=$ $\large \frac{y \, \cdot \, v}{v \, \cdot \, v}$ $v$ $\qquad$ (equation 1)

Where $\hat{y}$ is called the orthogonal projection vector, and so equation 1 may be referred to (in general) as the orthogonal projection formula.

Notice that the component of $y$ orthogonal to $v$ is equal to $z=y-\hat{y}$.
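To make equation 1 concrete, here is a minimal sketch in Python of projecting one vector onto another; the example vectors $y$ and $v$ below are hypothetical, chosen only for illustration:

```python
def dot(u, v):
    """Dot product of two vectors given as lists."""
    return sum(ui * vi for ui, vi in zip(u, v))

def project_onto_vector(y, v):
    """Orthogonal projection of y onto v: (y . v / v . v) * v (equation 1)."""
    c = dot(y, v) / dot(v, v)
    return [c * vi for vi in v]

# Hypothetical example vectors; any y and nonzero v in R^3 work
y = [3.0, 1.0, 2.0]
v = [1.0, 1.0, 0.0]

y_hat = project_onto_vector(y, v)
z = [yi - hi for yi, hi in zip(y, y_hat)]   # component of y orthogonal to v

print(y_hat)       # [2.0, 2.0, 0.0]
print(dot(z, v))   # 0.0 -- confirms z is orthogonal to v
```

The dot product of $z$ with $v$ comes out to zero, confirming that $z=y-\hat{y}$ really is the component of $y$ orthogonal to $v$.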

Now let us talk about orthogonal projections onto a subspace, not another vector, but a plane. For that:

Let $S$ be a subspace in $R^n$, then each vector $y$ in $R^n$ can be written as: $y= \hat{y}+z$

Where $\hat{y}$ is in $S$ and $z=y-\hat{y}$ is in $S^{\perp}$. And so, $\hat{y}$ is the orthogonal projection of $y$ onto $S$.

Therefore, if we want to calculate $\hat{y}$, we first need to check that {$v_1, ... , v_p$} is an orthogonal basis of $S$ (remember, this can be checked by computing the dot product of every pair of vectors in the set and verifying that each is zero). If the vectors do form an orthogonal basis, then we can calculate the projection of $y$ onto $S$ as:

proj$_{S}y = \hat{y} =$ $\large \frac{y \, \cdot \, v_1} {v_1 \, \cdot \, v_1}$ $v_1 \, + \,$ $\large \frac{y \, \cdot \, v_2} {v_2 \, \cdot \, v_2}$ $v_2 \, + \, ... \,$ $\large \frac{y \, \cdot \, v_p} {v_p \, \cdot \, v_p}$ $v_p$ $\qquad$ (equation 2)

However, if {$v_1, ... , v_p$} is an orthonormal basis of $S$, then the equation changes a little bit:

proj$_{S}y = \hat{y} = (y \cdot v_1)v_1 \, +\, (y \cdot v_2)v_2 \, +...+ \, (y \cdot v_p)v_p$ $\qquad$ (equation 3)

Remember that an orthonormal basis is a basis formed by a set of vectors which are orthogonal to each other AND are all unit vectors themselves.

#### How to find orthogonal projection

The steps to find the orthogonal projection of a vector $y$ onto a subspace are as follows:

1. Verify that the set of vectors provided is either an orthogonal basis or an orthonormal basis.
   - If it is an orthogonal basis, continue with step 2.
   - If it is an orthonormal basis, continue with step 3.
2. Given an orthogonal basis {$v_1,... , v_p$}, compute the projection of $y$ onto $S$ with the formula in equation 2:
   - Calculate the dot products $y \cdot v_1, ... , y \cdot v_p$
   - Calculate the dot products $v_1 \cdot v_1 , ... , v_p \cdot v_p$
   - Compute the quotients $\large \frac{y \, \cdot \, v_1} {v_1 \, \cdot \, v_1} , ... , \frac{y \, \cdot \, v_p} {v_p \, \cdot \, v_p}$
   - Multiply each quotient by its corresponding vector $v_1 ,... , v_p$
   - Add all of the resulting vectors together to find the final projection vector.
3. Given an orthonormal basis {$v_1 ,... , v_p$}, compute the projection of $y$ onto $S$ with the formula in equation 3:
   - Calculate the dot products $y \cdot v_1 , ... , y \cdot v_p$
   - Multiply each result by its corresponding vector $v_1 ,... , v_p$
   - Add all of the resulting vectors together to find the final projection vector.
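The steps above can be sketched in Python as a single helper. The basis and vector below are hypothetical, and the function assumes the basis has already been verified to be orthogonal (step 1); for an orthonormal basis each $v_i \cdot v_i$ equals 1, so the same code reduces to equation 3:

```python
def dot(u, v):
    """Dot product of two vectors given as lists."""
    return sum(ui * vi for ui, vi in zip(u, v))

def project_onto_subspace(y, basis):
    """Projection of y onto Span(basis) via equation 2.

    Assumes `basis` is an orthogonal basis (step 1 already verified).
    For an orthonormal basis each dot(v, v) is 1, so the same code
    reduces to equation 3.
    """
    y_hat = [0.0] * len(y)
    for v in basis:
        c = dot(y, v) / dot(v, v)                        # weight (y . v) / (v . v)
        y_hat = [h + c * vi for h, vi in zip(y_hat, v)]  # add c * v to the sum
    return y_hat

# Hypothetical orthogonal basis of a plane in R^3 (check: v1 . v2 = 0)
v1 = [1.0, 0.0, 1.0]
v2 = [1.0, 0.0, -1.0]
y = [2.0, 3.0, 4.0]

print(project_onto_subspace(y, [v1, v2]))   # [2.0, 0.0, 4.0]
```

Here the span of $v_1$ and $v_2$ is the $xz$-plane, so the projection simply drops the second component of $y$.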

And now you are ready to solve some exercise problems!

#### Orthogonal projection examples

__Example 1__

Assume that {$v_1, v_2, v_3$} is an orthogonal basis for $R^n$. Write $y$ as a sum of two vectors, one in Span{$v_1$} and one in Span{$v_2, v_3$}. The vectors $v_1, v_2, v_3$ and $y$ are defined as follows:

For this first problem we already assume that the vectors provided form an orthogonal basis of $R^n$; that means the vectors are orthogonal to each other and linearly independent. Therefore the spans Span{$v_1$} and Span{$v_2,v_3$} have bases {$v_1$} and {$v_2, v_3$}, each containing orthogonal (and hence linearly independent) vectors, and these characteristics make them orthogonal bases!

Therefore, we can be sure already that we can use equation 2 to solve for $\hat{y}$ in BOTH cases: the projection of $y$ onto Span{$v_1$} and the projection of $y$ onto Span{$v_2, v_3$}.

If we need to write y as a sum of two vectors, remember from figures 2 and 3 that $y = \hat{y} + z$,

And so, we calculate $\hat{y}$ first and then add it to $z$.

We will work only on the first part of the problem, writing $y$ as a sum of two vectors using Span{$v_1$}, and leave the second case for you to solve on your own.

So let us calculate $\hat{y}$!

For this case we have only one vector in the basis of the span $S$, and so the formula reduces to $\hat{y} = \large \frac{y \, \cdot \, v_1}{v_1 \, \cdot \, v_1}$ $v_1$.

With that we can now write y as a sum of two vectors in Span{$v_1$} as follows:


And what is $z$? Easy! We can calculate it just to see what it is:

__Example 2__

Verify that {$v_1, v_2$} is an orthonormal set, and then find the orthogonal projection of $y$ onto Span{$v_1, v_2$}.

To verify if the set {$v_1, v_2$} is orthonormal we first check if the vectors in the set are orthogonal to each other by computing their dot product:


Since the dot product yielded a result of zero, the vectors are orthogonal to each other. The second condition for the set to be an orthonormal set is that its vectors are unit vectors, thus, let us check if their magnitude is one.


And so, we have an orthonormal set, since we just proved that the vectors $v_1$ and $v_2$ are orthogonal unit vectors. Now we have to find the orthogonal projection of $y$ onto Span{$v_1, v_2$} using equation 3:

proj$_{S}y = \hat{y} = (y \cdot v_1)v_1 \, + \,(y \cdot v_2)v_2$

Using this equation, we plug in the values of the vectors $v_1, v_2$, and $y$ in order to calculate the projection vector $\hat{y}$:

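The two checks in this example (pairwise orthogonality and unit length) and the projection by equation 3 can be sketched in Python. Since the original vectors come from the figures, the pair below is a hypothetical orthonormal set in $R^2$:

```python
import math

def dot(u, v):
    """Dot product of two vectors given as lists."""
    return sum(ui * vi for ui, vi in zip(u, v))

def is_orthonormal(vectors, tol=1e-9):
    """True if every vector has unit length and every pair is orthogonal."""
    for i, vi in enumerate(vectors):
        if abs(math.sqrt(dot(vi, vi)) - 1.0) > tol:
            return False
        for vj in vectors[i + 1:]:
            if abs(dot(vi, vj)) > tol:
                return False
    return True

# Hypothetical orthonormal pair in R^2 (not the vectors from the figures)
s = 1.0 / math.sqrt(2.0)
v1 = [s, s]
v2 = [s, -s]
y = [1.0, 3.0]

print(is_orthonormal([v1, v2]))   # True

# Equation 3: y_hat = (y . v1) v1 + (y . v2) v2
c1, c2 = dot(y, v1), dot(y, v2)
y_hat = [c1 * a + c2 * b for a, b in zip(v1, v2)]
print(y_hat)   # ≈ [1.0, 3.0]: y is in Span{v1, v2}, so proj_S y = y
```

Here $\hat{y}$ comes out equal to $y$ itself, because Span{$v_1, v_2$} is all of $R^2$; this illustrates the property that proj$_S y = y$ whenever $y$ is in $S$.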

__Example 3__

Find the best approximation of $y$ by vectors of the form $c_1v_1 \, + \, c_2v_2$, where:

Having vectors of the form $c_1v_1 \, + \, c_2v_2$ means that we have a linear combination, which is the same as having the span of the vectors $v_1$ and $v_2$. And so, we obtain the subspace $S=$Span{$v_1, v_2$} with basis {$v_1, v_2$}.

Now the first thing to do in order to find the best approximation of $y$ is to check whether the basis provided is an orthogonal basis; for that, we compute the inner product of the two vectors in the basis:

And given that the set is an orthogonal set, due to the inner product above resulting in zero, we can now finally compute the vector $\hat{y}$ (which is the best approximation of $y$) by using the projection formula shown in equation 2:

proj$_{S}y = \hat{y}$ $\large = \frac{y\, \cdot v_1}{v_1 \, \cdot v_1}$ $v_1$ $\large +\frac{y\, \cdot v_2}{v_2 \, \cdot v_2}$ $v_2$

__Example 4__

Find the closest point to $y$ in the subspace $S$ spanned by $v_1$ and $v_2$:

The closest point to $y$ is simply $\hat{y}$, since it is the best approximation of $y$ and therefore lies at the shortest distance from $y$ among all vectors in $S$. And so, the purpose of this problem is to calculate $\hat{y}$.

For that we will use equation 2 once more in order to calculate the orthogonal projection of $y$ onto $S$, but before using that equation we need to verify its conditions.

First we need to check if we have an orthogonal basis for $S$.

Here we have a subspace spanned by $v_1$ and $v_2$, that is, the set of all linear combinations of $v_1$ and $v_2$ (with $v_1$ and $v_2$ linearly independent), so that $S=$ Span{$v_1 , v_2$} has basis {$v_1, v_2$}. In order to check whether this basis is an orthogonal basis, we have to see if $v_1$ and $v_2$ are orthogonal to each other; therefore, we compute their dot product!


And so, we have an orthogonal basis since the dot product above yielded a result of zero.

So now we can calculate $\hat{y}$ using equation 2:

proj$_{S}y = \hat{y}$ $\large = \frac{y\, \cdot v_1}{v_1 \, \cdot v_1}$ $v_1$ $\large +\frac{y\, \cdot v_2}{v_2 \, \cdot v_2}$ $v_2$

__Example 5__

Find the closest distance from $y$ to $S=$ Span{$v_1, v_2$} if $v_1, v_2$ and $y$ are defined as below:

In this case, the closest distance from $y$ to $S$ can be graphically represented below:

Therefore the closest distance is equal to the magnitude of the difference between the vectors $y$ and $\hat{y}$, so the closest distance $= \enspace \parallel y -\hat{y}\parallel$.

So if we want to calculate the closest distance, we need to compute $\hat{y}$; for that, we first need to confirm that we have an orthogonal basis, which we do by calculating the dot product of $v_1$ and $v_2$:


Knowing that we have an orthogonal basis due to the result above, we can now compute $\hat{y}$:

And now that we have the vector $\hat{y}$ we can finally compute the length $= \enspace \parallel y -\hat{y}\parallel$:
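The whole closest-distance computation can be sketched in Python; the vectors below are hypothetical stand-ins for the ones given in the figures:

```python
import math

def dot(u, v):
    """Dot product of two vectors given as lists."""
    return sum(ui * vi for ui, vi in zip(u, v))

def distance_to_span(y, basis):
    """Distance from y to Span(basis), assuming `basis` is orthogonal.

    First projects y onto the span (equation 2), then returns ||y - y_hat||.
    """
    y_hat = [0.0] * len(y)
    for v in basis:
        c = dot(y, v) / dot(v, v)
        y_hat = [h + c * vi for h, vi in zip(y_hat, v)]
    z = [yi - hi for yi, hi in zip(y, y_hat)]
    return math.sqrt(dot(z, z))

# Hypothetical orthogonal basis: S is the xy-plane in R^3
v1 = [1.0, 0.0, 0.0]
v2 = [0.0, 1.0, 0.0]
y = [3.0, 4.0, 5.0]

print(distance_to_span(y, [v1, v2]))   # 5.0
```

For this choice of basis, $S$ is the $xy$-plane, so the distance is simply the third component of $y$.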


And so, we have arrived at the end of our lesson; we hope you enjoyed it, and see you in the next topic!

**The Orthogonal Decomposition Theorem**

Let $S$ be a subspace in $\Bbb{R}^n$. Then each vector $y$ in $\Bbb{R}^n$ can be written as:

$y = \hat{y} + z$

where $\hat{y}$ is in $S$ and $z$ is in $S^{\perp}$. Note that $\hat{y}$ is the orthogonal projection of $y$ onto $S$.

If {$v_1,\cdots ,v_p$} is an orthogonal basis of $S$, then

$\hat{y}= \frac{y \cdot v_1}{v_1 \cdot v_1}v_1 + \cdots + \frac{y \cdot v_p}{v_p \cdot v_p}v_p$

However, if {$v_1,\cdots ,v_p$} is an orthonormal basis of $S$, then

$\hat{y}=(y\cdot v_1)v_1+\cdots +(y\cdot v_p)v_p$

**Property of Orthogonal Projection**

If {$v_1,\cdots ,v_p$} is an orthogonal basis for $S$ and if $y$ happens to be in $S$, then

$y = \frac{y \cdot v_1}{v_1 \cdot v_1}v_1 + \cdots + \frac{y \cdot v_p}{v_p \cdot v_p}v_p$

In other words, if y is in $S=$Span{$v_1,\cdots ,v_p$}, then $proj_{S}y=y$.

**The Best Approximation Theorem**

Let $S$ be a subspace of $\Bbb{R}^n$. Also, let $y$ be a vector in $\Bbb{R}^n$, and $\hat{y}$ be the orthogonal projection of $y$ onto $S$. Then $\hat{y}$ is the closest point to $y$ in $S$, because

$\lVert y - \hat{y} \rVert < \lVert y-u \rVert$

where $u$ are all vectors in $S$ that are distinct from $\hat{y}$.
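As a quick numerical sanity check of the theorem, we can sample many points $u$ in $S$ and confirm that none of them is closer to $y$ than $\hat{y}$. The subspace and vector below are hypothetical, chosen only for illustration:

```python
import math
import random

def dot(u, v):
    """Dot product of two vectors given as lists."""
    return sum(ui * vi for ui, vi in zip(u, v))

def norm(u):
    """Euclidean length of a vector."""
    return math.sqrt(dot(u, u))

# Hypothetical orthogonal basis of a plane S in R^3, and a y outside S
v1 = [1.0, 1.0, 0.0]
v2 = [1.0, -1.0, 0.0]
y = [2.0, 5.0, 7.0]

# Orthogonal projection of y onto S (equation 2)
y_hat = [0.0, 0.0, 0.0]
for v in (v1, v2):
    c = dot(y, v) / dot(v, v)
    y_hat = [h + c * vi for h, vi in zip(y_hat, v)]

best = norm([yi - hi for yi, hi in zip(y, y_hat)])
print(best)   # 7.0: the distance from y to S

# Sample many other points u = a*v1 + b*v2 in S: none beats y_hat
random.seed(42)
for _ in range(1000):
    a, b = random.uniform(-10.0, 10.0), random.uniform(-10.0, 10.0)
    u = [a * x1 + b * x2 for x1, x2 in zip(v1, v2)]
    assert norm([yi - ui for yi, ui in zip(y, u)]) >= best
```

Every sampled $u$ in $S$ sits at distance at least $\lVert y - \hat{y} \rVert$ from $y$, exactly as the theorem predicts.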
