Orthogonal sets


Intro Lessons
  1. Orthogonal Sets Overview:
  2. Orthogonal Sets and Basis
    • Each pair of vectors is orthogonal
    • Linearly independent → Form a basis
    • Calculate weights with the formula
  3. Orthonormal Sets and Basis
    • Is an orthogonal set
    • Each vector is a unit vector
    • Linearly independent → Form a basis
  4. Matrix $U$ with Orthonormal Columns and Properties
    • $U^TU=I$
    • 3 properties of matrix $U$
  5. Orthogonal Projection and Component
    • Orthogonal projection of $y$ onto $v$
    • The component of $y$ orthogonal to $v$
Example Lessons
  1. Orthogonal Sets and Basis
    Is this an orthogonal set?
    Verify that $B$ is an orthogonal basis for $\Bbb{R}^2$, and then express $x$ as a linear combination of the set of vectors in $B$.
  2. Orthonormal Sets/Basis
    Is set $B$ an orthonormal basis for $\Bbb{R}^3$?
  3. Let matrix $U$ and vectors $x$ and $y$ be given, where $U$ has orthonormal columns and $U^TU=I$. Verify that
    $(Ux)\cdot (Uy)=x\cdot y$
  4. Orthogonal Projection
    Let vectors $y$ and $v$ be given. Write $y$ as the sum of two orthogonal vectors, one in Span$\{v\}$ and one orthogonal to $v$.
Topic Notes

            Orthogonal sets


            What does orthogonal mean?


Remember from our past lesson on the inner product, length and orthogonality that we defined two vectors as orthogonal if they are perpendicular to each other (in two dimensions, orthogonal lines are simply perpendicular lines, so "orthogonal" vs. "perpendicular" amounts to the same thing). In linear algebra we extend this idea beyond two dimensions: since the scenarios we encounter live in three-dimensional space and beyond, we use the word orthogonal to describe perpendicularity not only between lines but also between planes and whole spaces, opening the field for the mathematical calculations we can compute and the cases that can be studied.
In the next diagram, you can see an example of orthogonal vectors:
            In the next diagram, you can see an example of orthogonal vectors:

Figure 1: Orthogonal vectors

Mathematically speaking, two vectors are orthogonal when their inner product equals zero. For example, take any two vectors, each lying in a different one of the orthogonal planes shown below (without lying along the planes' shared axis): they are automatically orthogonal, since their components can be seen to be orthogonal.

Figure 2: Orthogonal planes

            Before we continue, let us remind you that the inner product of two vectors is the same as the dot product, or scalar product between two vectors, and thus, we will be using these terms interchangeably during the lesson.

How to find an orthogonal vector set


So, answer quickly: how do you tell if two vectors are orthogonal to each other? The dot product! Keep this fresh in your head!
To formally define orthogonal sets with this in mind: A set of vectors $\{v_1, v_2, \ldots, v_n\}$ in $\Bbb{R}^n$ is an orthogonal set if each pair of vectors from the set is orthogonal. This means that the inner product (dot product) of any two distinct vectors within the set is equal to zero:

$v_i \cdot v_j = 0$
            Equation 1: Dot product of two orthogonal vectors

Note that for equation 1, the subindices $i$ and $j$ cannot be equal, so this equation holds when $i \neq j$, since $i = j$ would mean that we are multiplying the vector with itself, which results in the magnitude of the vector squared.

With this in mind, if the set of vectors $\{v_1, v_2, \ldots, v_n\}$ in $\Bbb{R}^n$ is an orthogonal set, then this is enough to prove linear independence for these vectors. Thus, the vectors form a basis for a subspace $S$, and this basis is what we call the orthogonal basis.

To check if a set is an orthogonal basis in $\Bbb{R}^n$, we can simply verify that it is an orthogonal set.
Remember from our lesson about linear combinations and vector equations in $\Bbb{R}^n$ that a linear combination of vectors $v_1, v_2, \ldots, v_p$ is defined as:

$x = c_1 v_1 + c_2 v_2 + \cdots + c_p v_p$
            Equation 2: Linear combination

Where $i = 1, 2, \ldots, p$, and the constants $c_1, c_2, \ldots, c_p$ can be calculated using the formula:

$c_i = \dfrac{x \cdot v_i}{v_i \cdot v_i}$
Equation 3: Formula for the constant values of a linear combination

Since an orthogonal set of vectors is a set where all the vectors are linearly independent, we can use this formula for a linear combination to solve for the vector $x$ defined in equation 2. Therefore, to find orthogonal vectors within a set, decide whether a set is orthogonal, and find the orthogonal basis, we follow these steps:
1. Determine if the vectors given in the set are orthogonal by computing their dot products in pairs.
2. If the dot product of every pair of vectors is zero, then the vector set is an orthogonal set.
3. If the set is orthogonal, then it forms an orthogonal basis.
  1. We can prove this by writing a given vector as the linear combination of the vectors from the set.
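Steps 1 and 2 can be sketched in a few lines of code. A minimal Python sketch, assuming each vector is given as a plain list of numbers (the sample vectors below are illustrative, not the ones from the exercises):

```python
def dot(u, v):
    """Dot (inner) product of two vectors given as lists of numbers."""
    return sum(ui * vi for ui, vi in zip(u, v))

def is_orthogonal_set(vectors):
    """Steps 1 and 2: every distinct pair must have a zero dot product."""
    n = len(vectors)
    return all(dot(vectors[i], vectors[j]) == 0
               for i in range(n) for j in range(i + 1, n))

# The standard basis of R^3 is the simplest orthogonal set.
basis = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
print(is_orthogonal_set(basis))             # True
print(is_orthogonal_set([[1, 1], [1, 2]]))  # False: dot product is 3
```

If `is_orthogonal_set` returns `True`, step 3 applies and the set forms an orthogonal basis for the subspace it spans.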

We will take a look at a few examples of such operations in our next section. For exercise example 1 we will need to follow only steps 1 and 2, while for exercise example 2 we will need to follow all three steps; it just depends on what you are asked, but the process itself is pretty straightforward.

Now, what happens if you are asked for an orthonormal basis? Notice: NOT orthogonal.
Well, a set of vectors $\{v_1, v_2, \ldots, v_n\}$ is an orthonormal set if it is an orthogonal set of unit vectors; in other words, all of the vectors within the set must be orthogonal to each other, and they must all have a magnitude (length) of one. If a subspace $S$ is spanned by such an orthonormal set, we say that the vectors inside of it form an orthonormal basis. This is easily understood because we already know that all of the vectors inside the set are linearly independent of each other, since this is a consequence of the set being orthogonal.
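The orthonormal condition adds one check on top of orthogonality: each vector must have length 1. A minimal Python sketch, with a small tolerance for floating-point error (the sample vectors are illustrative, not from the lesson):

```python
import math

def dot(u, v):
    """Dot product of two vectors given as lists of numbers."""
    return sum(ui * vi for ui, vi in zip(u, v))

def is_orthonormal_set(vectors, tol=1e-12):
    """An orthogonal set in which every vector has length 1."""
    n = len(vectors)
    pairwise = all(abs(dot(vectors[i], vectors[j])) < tol
                   for i in range(n) for j in range(i + 1, n))
    unit = all(abs(math.sqrt(dot(v, v)) - 1) < tol for v in vectors)
    return pairwise and unit

# Example: the standard basis of R^2 rotated by 45 degrees.
s = 1 / math.sqrt(2)
print(is_orthonormal_set([[s, s], [-s, s]]))  # True
print(is_orthonormal_set([[1, 1], [-1, 1]]))  # False: lengths are sqrt(2)
```

Note that `[[1, 1], [-1, 1]]` is still an orthogonal set; normalizing each vector (dividing by its length) would turn it into an orthonormal one.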

Following the definition of orthonormality, we can check whether the column vectors of a matrix are orthonormal by performing the following matrix multiplication:

$U^TU = I$
Equation 4: Condition for orthonormality of the columns of a matrix $U$

Where $U$ is an $m \times n$ matrix whose columns are orthonormal vectors, and $I$ is the identity matrix.
With equation 4 in mind, if we have this $m \times n$ matrix $U$ containing orthonormal columns and a pair of vectors $x$ and $y$ in $\Bbb{R}^n$, then the following three identities hold:

$\parallel Ux \parallel \, = \, \parallel x \parallel$

$(Ux) \cdot (Uy) = x \cdot y$

$(Ux) \cdot (Uy) = 0$ if and only if $x \cdot y = 0$
Equation 5: Identities of the multiplication of a matrix with orthonormal columns and vectors in $\Bbb{R}^n$
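These identities are easy to check numerically. A minimal Python sketch; the $3 \times 2$ matrix $U$ and the vectors $x$ and $y$ below are made-up values chosen so that the columns of $U$ are orthonormal (they are not from the lesson):

```python
import math

def dot(u, v):
    """Dot product of two vectors given as lists of numbers."""
    return sum(a * b for a, b in zip(u, v))

def matvec(U, x):
    """Multiply matrix U (given as a list of rows) by column vector x."""
    return [dot(row, x) for row in U]

# A hypothetical 3x2 matrix with orthonormal columns:
# column 1 = (s, s, 0), column 2 = (0, 0, 1), where s = 1/sqrt(2).
s = 1 / math.sqrt(2)
U = [[s, 0],
     [s, 0],
     [0, 1]]
x, y = [3.0, 4.0], [-1.0, 2.0]

Ux, Uy = matvec(U, x), matvec(U, y)
# Identity 1: ||Ux|| = ||x||
print(abs(math.sqrt(dot(Ux, Ux)) - math.sqrt(dot(x, x))) < 1e-12)  # True
# Identity 2: (Ux).(Uy) = x.y
print(abs(dot(Ux, Uy) - dot(x, y)) < 1e-12)  # True
```

Identity 3 follows directly from identity 2: if the dot products are always equal, one is zero exactly when the other is.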

Now that you know what orthogonal (and non-orthogonal, for that matter) means, how to tell if two vectors are orthogonal, and even how to find orthogonal vectors and sets of them, it is time we work through a few examples.

            Finding orthogonal vectors


During the next four problems we will use our knowledge of the orthogonal definition applied to vectors, sets and bases, and see how it develops into orthonormality. So have fun solving the exercises!

            Example 1

            Having a set of three vectors as shown below:

Equation 6: Set of three vectors

            Is this an orthogonal set?

In order to prove that the set is orthogonal, we have to determine whether the given vectors are orthogonal, parallel, or neither. And how do we do that? With the dot product! Remember that the dot product of orthogonal vectors is equal to zero; therefore, we just have to compute the dot product of each pair of these vectors and check the results.

Equation 7: Computing the dot product of the vectors in the set

Since all of the dot products resulted in zero, the three vectors in the set are all orthogonal to each other, and therefore this is an orthogonal set of vectors.

            Example 2

Given the set $B$ and the column vector $x$ as defined below:

Equation 8: Set B and vector x

Verify that $B$ is an orthogonal basis for $\Bbb{R}^2$, and then express $x$ as a linear combination of the set of vectors in $B$.
We start by verifying the orthogonality of $v_1$ and $v_2$, the two vectors in set $B$; for that, we compute the dot product $v_1 \cdot v_2$:

Equation 9: Dot product of vectors v1 and v2

A result of zero for the dot product tells us that $B$ is a set of orthogonal vectors.
Then, our next step is to express $x$ as a linear combination of the set of vectors in $B$. Using equation 2, we write the linear combination of the vectors from set $B$ and set it equal to vector $x$:

Equation 10: Linear combination of vectors v1 and v2

All that is left to complete this expression is to find the values of the constants $c_1$ and $c_2$. Using equation 3 we have:

$c_1 = \dfrac{x \cdot v_1}{v_1 \cdot v_1} = \dfrac{(-3)(1) + (3)(3)}{(1)(1) + (3)(3)} = \dfrac{-3+9}{1+9} = \dfrac{6}{10} = \dfrac{3}{5}$

$c_2 = \dfrac{x \cdot v_2}{v_2 \cdot v_2} = \dfrac{(-3)(-9) + (3)(3)}{(-9)(-9) + (3)(3)} = \dfrac{27+9}{81+9} = \dfrac{36}{90} = \dfrac{2}{5}$
Equation 10: Computing the values of the constants c1, c2

And so, we rewrite the linear combination including the found values of $c_1$ and $c_2$:

Equation 11: Linear combination x
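The whole computation can be double-checked in a few lines of Python, using the vectors read off from the dot products above: $v_1 = (1, 3)$, $v_2 = (-9, 3)$ and $x = (-3, 3)$.

```python
def dot(u, v):
    """Dot product of two vectors given as lists of numbers."""
    return sum(a * b for a, b in zip(u, v))

# The data from this example: B = {v1, v2} and the vector x.
v1, v2, x = [1, 3], [-9, 3], [-3, 3]

assert dot(v1, v2) == 0  # B is an orthogonal set

# Weights from equation 3: c_i = (x . v_i) / (v_i . v_i)
c1 = dot(x, v1) / dot(v1, v1)
c2 = dot(x, v2) / dot(v2, v2)
print(c1, c2)  # 0.6 0.4, i.e. 3/5 and 2/5

# Check that the linear combination c1*v1 + c2*v2 reproduces x.
combo = [c1 * a + c2 * b for a, b in zip(v1, v2)]
print(combo)
```

The printed combination matches $x = (-3, 3)$ up to floating-point rounding, confirming the result of equation 11.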


            Example 3

Having the set $B$ of vectors (as defined below), is set $B$ an orthonormal basis for $\Bbb{R}^3$?

Equation 12: Vector set B

In order to prove that this is an orthonormal basis for $\Bbb{R}^3$, we need to prove that this is an orthonormal set. Remember that an orthonormal set is composed of vectors which are orthogonal to each other and which, at the same time, are all unit vectors. So, let us prove the orthogonality of the vectors in the set first:

Equation 13: Dot product of the vectors in the set

If each of these dot products equals zero, the set is orthogonal; to conclude that $B$ is an orthonormal basis, we must also check that every vector is a unit vector, that is, $v_i \cdot v_i = 1$ for each vector in the set.


            Example 4

Let $U$, $x$ and $y$ be as defined below, where $U$ has orthonormal columns and $U^TU=I$.

Equation 14: Matrix U and vectors x and y

            Now, verify that:

$(Ux) \cdot (Uy) = x \cdot y$
            Equation 15: Equation to be verified

We will solve this problem in parts: the left-hand side and the right-hand side. Since the right-hand side is shorter to solve, let's start with that dot product first:

$x \cdot y = (3)(6) + (\sqrt{2})(2\sqrt{2}) = 18 + 4 = 22$
            Equation 16: Dot product of vectors x and y

Now we work through the left-hand side of the equation to verify:

Computing $Ux$ and $Uy$ by matrix-vector multiplication gives $Ux = (-1, -1, 3)$ and $Uy = (-2, -2, 6)$, so:

            (Ux)(Uy)=(1)(2)+(1)(2)+(3)(6)=2+2+18=22 (Ux) \cdot (Uy)=(-1)(-2)+(-1)(-2)+(3)(6)=2+2+18=22
            Equation 17: Dot product of the multiplication of matrix U with the column vectors x and y

Notice that both sides of equation 15 yield the value 22; therefore, the expression is verified.

            ***

Now that you have learnt about orthogonality and vector sets with this characteristic, we recommend you visit this lesson on orthogonal sets, which includes a variety of detailed problems that can supplement your studies.
In our next lesson we will introduce the concept of orthogonal projection, which is of course closely tied to the definitions we saw during this lesson. We hope you have enjoyed today's lesson; see you in the next one!
A set of vectors $\{v_1,\cdots,v_n\}$ in $\Bbb{R}^n$ is an orthogonal set if each pair of vectors from the set is orthogonal. In other words,
$v_i \cdot v_j = 0$
where $i \neq j$.

If the set of vectors $\{v_1,\cdots,v_n\}$ in $\Bbb{R}^n$ is an orthogonal set, then the vectors are linearly independent. Thus, the vectors form a basis for a subspace $S$. We call this the orthogonal basis.

To check if a set is an orthogonal basis in $\Bbb{R}^n$, simply verify that it is an orthogonal set.
The weights of a linear combination
$y = c_1 v_1 + c_2 v_2 + \cdots + c_p v_p$

are calculated by using the formula:
$c_i = \dfrac{y \cdot v_i}{v_i \cdot v_i}$
where $i = 1,\cdots,p$.

A set $\{v_1,\cdots,v_p\}$ is an orthonormal set if it is an orthogonal set of unit vectors.

If $S$ is a subspace spanned by this set, then we say that $\{v_1,\cdots,v_p\}$ is an orthonormal basis. This is because the vectors are already linearly independent.

An $m \times n$ matrix $U$ has orthonormal columns if and only if $U^TU = I$.

Let $U$ be an $m \times n$ matrix with orthonormal columns, and let $x$ and $y$ be in $\Bbb{R}^n$. Then the 3 following things are true:
1) $\lVert Ux \rVert = \lVert x \rVert$
2) $(Ux) \cdot (Uy) = x \cdot y$
3) $(Ux) \cdot (Uy) = 0$ if and only if $x \cdot y = 0$

Consider $L$ to be the subspace spanned by the vector $v$. Then the orthogonal projection of $y$ onto $v$ is calculated to be:
$\hat{y} = \text{proj}_L\, y = \dfrac{y \cdot v}{v \cdot v}v$

The component of $y$ orthogonal to $v$ (denoted as $z$) would be:
$z = y - \hat{y}$
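These two formulas can be checked numerically in a short Python sketch; the vectors $y$ and $v$ below are made-up values, not the ones from the exercise:

```python
def dot(u, v):
    """Dot product of two vectors given as lists of numbers."""
    return sum(a * b for a, b in zip(u, v))

def project(y, v):
    """Orthogonal projection of y onto v: y_hat = ((y.v)/(v.v)) v."""
    c = dot(y, v) / dot(v, v)
    return [c * vi for vi in v]

# Hypothetical example vectors in R^2.
y, v = [7.0, 6.0], [4.0, 2.0]

y_hat = project(y, v)                        # component of y in Span{v}
z = [yi - hi for yi, hi in zip(y, y_hat)]    # component of y orthogonal to v

print(y_hat)      # [8.0, 4.0]
print(z)          # [-1.0, 2.0]
print(dot(z, v))  # 0.0, so z really is orthogonal to v
```

By construction, $y = \hat{y} + z$, which is exactly the decomposition asked for in the orthogonal-projection exercise above.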