# Orthogonal sets

##### Intros
###### Lessons
1. Orthogonal Sets Overview:
2. Orthogonal Sets and Basis
• Each pair of vectors is orthogonal
• Linearly independent → form a basis
• Calculate weights with Formula
3. Orthonormal Sets and Basis
• Is an orthogonal set
• Each vector is a unit vector
• Linearly independent → form a basis
4. Matrix $U$ with Orthonormal columns and Properties
$U^T U=I$
• 3 Properties of Matrix $U$
5. Orthogonal Projection and Component
• Orthogonal Projection of $y$ onto $v$
• The component of $y$ orthogonal to $v$
##### Examples
###### Lessons
1. Orthogonal Sets and Basis
Is this an orthogonal set?
2. Verify that $B$ is an orthogonal basis for $\Bbb{R}^2$, and then express $x$ as a linear combination of the set of vectors in $B$.
3. Orthonormal Sets/Basis
Is set $B$ an orthonormal basis for $\Bbb{R}^3$?
4. Let $U$, $x$, and $y$ be given, where $U$ has orthonormal columns and $U^TU=I$. Verify that
$(Ux)\cdot (Uy)=x\cdot y$
5. Orthogonal Projection
Let $y$ and $v$ be given. Write $y$ as the sum of two orthogonal vectors, one in Span{$v$} and one orthogonal to $v$.

## Orthogonal sets

#### What does orthogonal mean?

Remember from our past lesson on the inner product, length and orthogonality that we defined two vectors as orthogonal if they are perpendicular to each other (for lines and vectors, orthogonal and perpendicular mean the same thing). In linear algebra we are no longer restricted to two dimensions: we extend the idea into three dimensions (given the scenarios that can be encountered in our world, since we live in a three-dimensional space) and beyond, so we use the word orthogonal to describe perpendicularity not only between lines but also between planes and whole subspaces, opening the field for the mathematical calculations we can compute and the cases that can be studied.
Mathematically speaking, orthogonal means that when you compute the inner product of two vectors, the result must be zero for the vectors to be orthogonal. For example, a vector lying along one coordinate axis is automatically orthogonal to a vector lying along a different coordinate axis, since their dot product has no nonzero terms. Before we continue, let us remind you that the inner product of two vectors is the same as the dot product, or scalar product, between two vectors, and thus we will be using these terms interchangeably during the lesson.

#### How to find orthogonal vector set

So, answer quickly: how do you tell whether two vectors are orthogonal to each other? The dot product! Keep this fresh in your head!
To formally define orthogonal sets with this in mind: a set of vectors {$v_1, v_2, ... ,v_n$} in $R^n$ is an orthogonal set if each pair of vectors from the set is orthogonal. This means that the inner product (dot product) of any two distinct vectors within the set is equal to zero:

$v_i \cdot v_j= 0$

Note that in this equation the subindices $i$ and $j$ cannot be equal, so the condition applies only when $i \neq j$, since $i = j$ would mean multiplying a vector with itself, which results in the squared magnitude of the vector (not zero in general).
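This pairwise condition is easy to check directly. The sketch below, in plain Python with made-up example vectors, tests $v_i \cdot v_j = 0$ for every pair with $i \neq j$:

```python
# Minimal sketch (hypothetical example vectors): a set is orthogonal
# when every distinct pair of vectors has dot product zero.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def is_orthogonal_set(vectors):
    # Check v_i . v_j = 0 for all pairs with i != j.
    return all(dot(vectors[i], vectors[j]) == 0
               for i in range(len(vectors))
               for j in range(i + 1, len(vectors)))

# Hypothetical set in R^3: scaled standard basis vectors.
S = [[3, 0, 0], [0, 2, 0], [0, 0, 5]]
print(is_orthogonal_set(S))                 # True
print(is_orthogonal_set([[1, 1], [1, 0]]))  # False: (1,1).(1,0) = 1
```

Note that only the pairs $i < j$ need testing, since the dot product is symmetric.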

With this in mind, if the set of vectors {$v_1, v_2, ... ,v_n \;$} in $\, R^n \,$ is an orthogonal set of nonzero vectors, then this is enough to prove that the vectors are linearly independent. Thus, the vectors form a basis for a subspace $S$, and this basis is what we call an orthogonal basis.

To check if a set is an orthogonal basis in $\, R^n \,$, we can simply verify that it is an orthogonal set.
Remember from our lesson about linear combinations and vector equations in $R^n$ that a linear combination of vectors $v_1, v_2, ... ,v_p \,$ is defined as:

$x= c_1 v_1 + c_2 v _2 + ... +c_p v_p$

Where the constants $c_1, c_2, ... ,c_p$ can be calculated using the formula:

$\large c_i = \frac{x \; \cdot \; v_i}{v_i \; \cdot \; v_i}$

for $i = 1, 2, ... , p$.

Since the vectors in an orthogonal set (of nonzero vectors) are linearly independent, we can use this linear-combination formula to solve for the weights of the vector $x$ defined above. Therefore, to find orthogonal vectors within a set, decide whether a set is orthogonal, and find the orthogonal basis, we need to follow the next steps:
1. Determine whether the vectors given in the set are orthogonal by computing their dot products in pairs.
2. If every pairwise dot product is zero, then the vector set is an orthogonal set.
3. If the set is orthogonal, then it forms an orthogonal basis; we can confirm this by writing a vector as a linear combination of the vectors from the set.
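The three steps above can be sketched in plain Python. The vectors $v_1$, $v_2$ and $x$ here are hypothetical examples chosen for illustration:

```python
# Sketch of the three steps with a hypothetical orthogonal pair in R^2.
from fractions import Fraction

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

v1, v2 = [2, 1], [-1, 2]     # hypothetical set of vectors
assert dot(v1, v2) == 0      # steps 1-2: every pair is orthogonal

# Step 3: the set is an orthogonal basis, so any x in R^2 has
# weights c_i = (x . v_i) / (v_i . v_i).
x = [3, 4]
c = [Fraction(dot(x, v), dot(v, v)) for v in (v1, v2)]
print(c)  # weights c_1 = 2, c_2 = 1

# Reconstruct x from the weights to confirm: c_1*v1 + c_2*v2 = x
rebuilt = [c[0] * a + c[1] * b for a, b in zip(v1, v2)]
assert rebuilt == x
```

Using `Fraction` keeps the weights exact, which is convenient when checking hand computations.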

We will take a look at a few examples of such operations in our next section. For exercise example 1 we will need to follow only steps 1 and 2, while for exercise example 2 we will need to follow all three steps; it just depends on what you are asked, but the process itself is pretty straightforward.

Now, what happens if you are asked for an orthonormal basis? Notice: NOT orthogonal.
Well, a set of vectors {$v_1, v_2, ... ,v_n \;$} is an orthonormal set if it is an orthogonal set of unit vectors. In other words, all of the vectors within the set must be orthogonal to each other, and, in addition, they must all have a magnitude (length) of one. If a subspace $S$ is spanned by such an orthonormal set, we say that the vectors in the set form an orthonormal basis, and this is easily understood because we already know that all of the vectors in the set are linearly independent, since this follows from being an orthogonal set of nonzero vectors.

Following the definition of orthonormal, we can check whether the columns of a matrix are orthonormal vectors by performing the following matrix multiplication:

$\large U^TU = I$

Where $U\,$ is an $\,m \times n \,$ matrix whose columns are orthonormal vectors, and $I$ is the identity matrix.
With this condition in mind, if we have an $\,m \times n \,$ matrix $U$ containing orthonormal columns and a pair of vectors $x\,$ and $\, y\;$ in $\, R^n \,$, then the following three identities hold:

$\parallel Ux \parallel \;= \; \parallel x \parallel$

$(Ux) \cdot (Uy) = x \cdot y$

$(Ux) \cdot (Uy) = 0 \;$if and only if $\, x \cdot y = 0$
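These identities are straightforward to check numerically. The sketch below builds a hypothetical $3 \times 2$ matrix $U$ whose columns are orthonormal by construction, then verifies $U^TU = I$ and the preservation of dot products:

```python
# Hypothetical 3x2 matrix U with orthonormal columns: check U^T U = I
# and that multiplying by U preserves dot products, (Ux).(Uy) = x.y.
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

s = 1 / math.sqrt(2)
cols = [[s, s, 0], [0, 0, 1]]  # the columns of U (orthonormal by construction)

# U^T U: entry (i, j) is the dot product of column i with column j,
# so orthonormal columns give the identity matrix.
UtU = [[dot(ci, cj) for cj in cols] for ci in cols]
print(UtU)  # approximately [[1, 0], [0, 1]]

def apply_U(x):
    # U x is the linear combination of the columns of U weighted by x.
    return [sum(cols[i][k] * x[i] for i in range(2)) for k in range(3)]

x, y = [3, math.sqrt(2)], [6, 2 * math.sqrt(2)]
print(dot(apply_U(x), apply_U(y)), dot(x, y))  # both are 22, up to rounding
```

The small rounding differences come from floating-point arithmetic; with exact arithmetic both sides would agree exactly.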

Now that you know what orthogonal (and non-orthogonal, for that matter) means, how to tell if two vectors are orthogonal, and even how to find orthogonal vectors and sets of them, it is time we work through a few examples.

#### Finding orthogonal vectors

During the next four problems we will use our knowledge of the definition of orthogonality applied to vectors, sets and bases, and then see how it develops into orthonormality. So have fun solving the exercises!

#### Example 1

Having a set of three vectors as shown below: Is this an orthogonal set?

In order for us to prove that the set is orthogonal, we have to determine whether the given vectors are orthogonal to one another. And how do we do that? With the dot product! Remember that the dot product of orthogonal vectors is equal to zero; therefore, we just have to compute the dot product of each pair of these vectors and look at the results. Since all of the dot product operations result in zero, the three vectors in the set are all orthogonal to each other, and therefore this is an orthogonal set of vectors.

#### Example 2

Given the set $B = \{v_1, v_2\}$ and the column vector $x$, with $v_1 = (1, 3)$, $v_2 = (-9, 3)$ and $x = (-3, 3)$ written as column vectors: verify that $B$ is an orthogonal basis for $R^2$, and then express $x$ as a linear combination of the set of vectors in $B$.
So we start by verifying the orthogonality of $v_1$ and $v_2$, the two vectors in the set called $B$; for that, we get the dot product $v_1 \cdot v_2 = (1)(-9)+(3)(3) = 0$. A result of zero for the dot product tells us that the set $B$ is a set of orthogonal vectors.
Then, our next step is to express $x$ as a linear combination of the vectors in $B$: $\, x = c_1 v_1 + c_2 v_2$. All that is left to complete this expression is to find the values of the constants $c_1, c_2$. Using the weight formula we have that:

$\large c_1 = \frac{x \cdot v_1} {v_1 \cdot v_1} = \frac{(-3)(1) + (3)(3)}{(1)(1) + (3)(3)} = \frac{-3+9}{1+9} = \frac{6}{10} = \frac{3}{5}$

$\large c_2 = \frac{x \cdot v_2} {v_2 \cdot v_2} = \frac{(-3)(-9) + (3)(3)}{(-9)(-9) + (3)(3)} = \frac{27+9}{81+9} = \frac{36}{90} = \frac{2}{5}$
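The two weight computations above can be double-checked in plain Python, reading the vectors off the dot products: $v_1 = (1, 3)$, $v_2 = (-9, 3)$ and $x = (-3, 3)$.

```python
# Verify the worked weights c_1 = 3/5 and c_2 = 2/5 with exact arithmetic.
from fractions import Fraction

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

v1, v2, x = [1, 3], [-9, 3], [-3, 3]
c1 = Fraction(dot(x, v1), dot(v1, v1))   # 6/10 = 3/5
c2 = Fraction(dot(x, v2), dot(v2, v2))   # 36/90 = 2/5
print(c1, c2)  # 3/5 2/5

# The linear combination c1*v1 + c2*v2 recovers x = (-3, 3):
rebuilt = [c1 * a + c2 * b for a, b in zip(v1, v2)]
assert rebuilt == x
```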

And so, we rewrite the linear combination including the found values of $c_1\,$ and $\, c_2$: $\; x = \frac{3}{5} v_1 + \frac{2}{5} v_2$.

#### Example 3

Having set $B$ of vectors (as defined below), is set $B$ an orthonormal basis for $R^3$? In order to prove that this is an orthonormal basis for $R^3$, we need to prove that it is an orthonormal set. Remember that an orthonormal set is composed of vectors which are orthogonal to each other and which are, at the same time, all unit vectors. So, let us prove the orthogonality of the vectors in the set first, and then check that each vector has length one.

#### Example 4

Let $U$, $x \,$ and $\,y$ be as defined below, where $U$ has orthonormal columns and $U^TU=I$. Now, verify that:

$(Ux) \cdot (Uy)= x \cdot y$

We will solve this problem in two parts: the left-hand side and the right-hand side. Since the right-hand side will be shorter to solve, let's start by doing that dot product first:

$x \cdot y=(3)(6)+(\sqrt{2})(2\sqrt{2})=18+4=22$

Now, working through the left-hand side of the equation to verify: $(Ux) \cdot (Uy)=(-1)(-2)+(-1)(-2)+(3)(6)=2+2+18=22$

Notice that the results from both sides of the equation yield the value of 22, and therefore the identity is verified.

***

Now that you have learnt about orthogonality and vector sets with this characteristic, we recommend you visit this lesson on orthogonal sets, which includes a variety of detailed problems that can supplement your studies.
In our next lesson we will introduce the concept of orthogonal projection, which of course is closely tied to the definitions we saw during this lesson. We hope that you have enjoyed today's lesson, and this is it, see you in the next one!
A set of vectors {$v_1,\cdots,v_n$} in $\Bbb{R}^n$ is an orthogonal set if each pair of vectors from the set is orthogonal. In other words,
$v_i \cdot v_j =0$
where $i \neq j$.

If the set of vectors {$v_1,\cdots,v_n$} in $\Bbb{R}^n$ is an orthogonal set of nonzero vectors, then the vectors are linearly independent. Thus, the vectors form a basis for a subspace $S$. We call this the orthogonal basis.

To check if a set is an orthogonal basis in $\Bbb{R}^n$, simply verify if it is an orthogonal set.
If {$v_1,\cdots,v_p$} is an orthogonal basis for a subspace, then the weights $c_1,\cdots,c_p$ of the linear combination
$y=c_1 v_1+c_2 v_2+\cdots+c_p v_p$

are calculated by using the formula:
$c_i = \frac{y \cdot v_i}{v_i \cdot v_i}$
where $i=1,\cdots,p$.

A set {$v_1,\cdots,v_p$} is an orthonormal set if it is an orthogonal set of unit vectors.

If $S$ is a subspace spanned by this set, then we say that {$v_1,\cdots,v_p$} is an orthonormal basis. This is because the vectors are already linearly independent.

An $m \times n$ matrix $U$ has orthonormal columns if and only if $U^T U=I$.

Let $U$ be an $m \times n$ matrix with orthonormal columns, and let $x$ and $y$ be in $\Bbb{R}^n$. Then the 3 following things are true:
1) $\lVert Ux \rVert = \lVert x \rVert$
2) $(Ux) \cdot (Uy)=x \cdot y$
3) $(Ux) \cdot (Uy)=0$ if and only if $x \cdot y =0$

Consider $L$ to be the subspace spanned by the vector $v$. Then the orthogonal projection of $y$ onto $v$ is calculated to be:
$\hat{y}=\text{proj}_L\, y=\frac{y \cdot v}{v \cdot v}v$

The component of $y$ orthogonal to $v$ (denoted as $z$) would be:
$z=y-\hat{y}$
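The two formulas above can be sketched in plain Python. The vectors $y$ and $v$ here are hypothetical examples; the final check confirms that $z$ really is orthogonal to $v$:

```python
# Project y onto the line L = Span{v}, then split y into y_hat + z
# with z orthogonal to v (hypothetical example vectors).
from fractions import Fraction

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def project(y, v):
    # y_hat = ((y . v)/(v . v)) * v
    c = Fraction(dot(y, v), dot(v, v))
    return [c * a for a in v]

y, v = [7, 6], [4, 2]
y_hat = project(y, v)                  # component of y in Span{v}
z = [a - b for a, b in zip(y, y_hat)]  # component of y orthogonal to v
print(y_hat, z)   # y_hat = (8, 4), z = (-1, 2)
print(dot(z, v))  # 0 -> z is orthogonal to v, and y = y_hat + z
```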