Eigenvalues and eigenvectors


Intro Lessons
  1. Eigenvalues and Eigenvectors Overview:
  2. Definition of Eigenvalues and Eigenvectors
    • What are eigenvectors?
    • What are eigenvalues?
  3. Verifying Eigenvalues/Eigenvectors of a Matrix
    • Eigenvectors: Show that Ax = \lambda x
    • Eigenvalues: Get a non-trivial solution for (A - \lambda I)x = 0
    • Finding an eigenvector given an eigenvalue
  4. Eigenspace
    • What is an eigenspace?
    • Finding a basis for the eigenspace
Example Lessons
  1. Verifying Eigenvectors
    Let A be the given matrix. Is the first given vector an eigenvector of A? If so, find the eigenvalue. What about the second given vector?
  2. Let A be the given matrix. Is the given vector an eigenvector of A? If so, find the eigenvalue.
  3. Verifying Eigenvalues and Finding a Corresponding Eigenvector
    Let A be the given matrix. Is \lambda = 1 an eigenvalue of A? If so, find a corresponding eigenvector.
  4. Finding a Basis for the Eigenspace
    Find a basis for the corresponding eigenspace for the given matrix and eigenvalue.
  5. Proof Related to Eigenvalues and Eigenvectors
    Prove that if A^2 is the zero matrix, then the only eigenvalue of A is 0.
  6. Prove that if \lambda is an eigenvalue of an invertible matrix A, then \lambda^{-1} is an eigenvalue of A^{-1}.
              Topic Notes

              Eigenvalues and Eigenvectors


During our lesson about the image and range of linear transformations, we learnt that a linear transformation T(x) = Ax can re-scale certain vectors without changing their direction. Now it is time to talk about such a vector x, called an eigenvector: where does it come from, and how is it related to the given matrix A?

What are eigenvalues and eigenvectors?


In order to understand what an eigenvalue is, we have to define an eigenvector first.
For any given square matrix A, an eigenvector is a non-zero vector x such that Ax = \lambda x for some scalar \lambda. In other words, an eigenvector is a non-zero characteristic vector (a column vector) which changes only by a constant factor (which we name "lambda", \lambda) when the square matrix is applied to it through matrix multiplication.

The eigenvalue is then this constant factor \lambda.

Given this, we know that every eigenvalue of an n \times n matrix has an eigenvector that corresponds to it. This condition of correspondence between an eigenvector and an eigenvalue is written as:

Ax = \lambda x
              Equation 1: Condition of correspondence between eigenvector and eigenvalue

we say that the eigenvector x corresponds to the eigenvalue \lambda.

• Verifying Eigenvectors
In order to verify whether a given column vector x is an eigenvector of a given square matrix A, we use the condition shown in equation 1. But before going into the multiplication of matrix A and vector x, there are a few simple things we need to check about this matrix and vector.

First, the given column vector x MUST have the same number of entries as the matrix A has columns (equivalently rows, since A is square); otherwise, the multiplication Ax is not possible. Second, the vector x cannot be the zero vector. This second requirement will come back when we verify eigenvalues in the next part of this section.

The steps to verify whether a given column vector x is an eigenvector of the matrix A are:
  • Multiply the matrix A times the vector x.
    • The resulting vector should be equal to the multiplication of \lambda times the vector x.
  • Check whether a constant value can be factored out of the resulting vector, leaving the multiplication of this constant times the vector x. This constant will be \lambda.
    • In other words, you should be able to factor the vector x out of the vector resulting from Ax.
    • Once you have factored out the vector x, there will be a constant factor multiplying it.
    • This constant factor is the eigenvalue \lambda.

  • If no constant value can be factored out of the vector resulting from Ax to produce a multiple of the vector x, then x is NOT an eigenvector.

We will demonstrate this process with a few exercises in a later section of this article; for now, let us look at the verification process for eigenvalues.
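As a quick computational cross-check of the steps above, here is a minimal sketch in Python with NumPy. The matrix A and candidate vector x below are hypothetical stand-ins chosen for illustration (the matrices in this lesson's exercises are shown as images), not part of the original examples.

```python
import numpy as np

# Hypothetical square matrix and candidate column vector (stand-ins for illustration).
A = np.array([[4.0, 2.0],
              [1.0, 3.0]])
x = np.array([2.0, 1.0])

Ax = A @ x

# Estimate the would-be eigenvalue by projecting Ax onto x, then check whether
# Ax really equals lambda * x (the correspondence condition in equation 1).
lam = (Ax @ x) / (x @ x)
if np.allclose(Ax, lam * x):
    print(f"x is an eigenvector of A, with eigenvalue lambda = {lam}")  # lambda = 5.0 here
else:
    print("x is NOT an eigenvector of A")
```

If Ax is not a scalar multiple of x, the residual Ax - lam * x is non-zero and the check fails, mirroring the last bullet above.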

              • Verifying Eigenvalues
In order to verify whether a value of \lambda is an eigenvalue, we need to solve the relationship:

(A - \lambda I)x = 0
              Equation 2: Matrix equation for verification of eigenvalues

And find a non-trivial solution. Here, I represents the identity matrix; therefore, the product \lambda I is always a diagonal square matrix whose diagonal elements are all \lambda and whose remaining entries are all zeros.

In order for the value of \lambda to be an eigenvalue, the equation must have a non-trivial solution, because the trivial solution means the resulting vector is the zero vector. It so happens that this resulting vector is the column vector x (from equation 1), which is the eigenvector corresponding to the provided value of \lambda. Therefore, if the only result were the trivial solution, x would be the zero vector, which we know (from the last part of this section) it cannot be.

And so, the value of \lambda is an eigenvalue of the matrix A ONLY when equation 2 has a non-trivial solution for the vector x.

The steps to verify whether a given \lambda is an eigenvalue of the matrix A are:
  • Perform the subtraction between the matrices A and \lambda I (don't forget to substitute \lambda with its constant value) to obtain a new matrix.
  • The multiplication of this new matrix with the vector x can be rewritten as an augmented matrix.
    • Row reduce the augmented matrix into echelon form in order to find the components of the vector x.
    • Once you find the components of x, if there is a non-trivial solution, then \lambda is an eigenvalue of A and its corresponding eigenvector is x.

  • If only the trivial solution is found, then \lambda is not an eigenvalue of A.

Once more, the verification process will be shown in the exercises later in our lesson.
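The same check can be done numerically: equation 2 has a non-trivial solution exactly when A - \lambda I is rank-deficient. Below is a minimal sketch, again with a hypothetical matrix and candidate value rather than one from the lesson's exercises.

```python
import numpy as np

# Hypothetical matrix and candidate eigenvalue (stand-ins for illustration).
A = np.array([[4.0, 2.0],
              [1.0, 3.0]])
lam = 5.0
n = A.shape[0]

M = A - lam * np.eye(n)                 # the matrix A - lambda*I from equation 2

# (A - lambda*I)x = 0 has a non-trivial solution exactly when M is rank-deficient.
if np.linalg.matrix_rank(M) < n:
    print(f"lambda = {lam} is an eigenvalue of A")
    # Any non-zero vector in the null space of M is a corresponding eigenvector;
    # the right-singular vector for the zero singular value spans it here.
    _, _, vt = np.linalg.svd(M)
    print("corresponding eigenvector (up to scale):", vt[-1])
else:
    print(f"lambda = {lam} is NOT an eigenvalue of A")
```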

              Calculating eigenvalues and eigenvectors


              The calculation of eigenvalues is done through an algebraic expression called the characteristic polynomial equation:

det(A - \lambda I) = 0
              Equation 3: Characteristic polynomial equation

We will dedicate our next lesson to the characteristic equation; still, we will briefly explain here how to use it to calculate eigenvalues and eigenvectors, without going into detail about its origin or its use in other cases.

Notice in equation 3 that, inside the determinant of the characteristic equation, there is a subtraction of the given n \times n square matrix A and the scalar multiplication of the identity matrix by the eigenvalue \lambda. That subtraction produces another matrix; once we compute the determinant of that matrix, the result is a polynomial in the variable \lambda, of degree n (a second-degree expression when A is a 2 \times 2 matrix), whose roots we have to find.

And so, for a 2 \times 2 matrix, the computation of eigenvalues is a simple calculation that can be done using the quadratic formula.
But this lesson is called eigenvalues and eigenvectors for a reason: since the eigenvalues are the roots of the characteristic polynomial, there is usually more than one eigenvalue associated with a given square matrix. With that in mind, you should know that for each eigenvalue there is a corresponding eigenvector. And so, the main purpose of obtaining the eigenvalues of a matrix is to then produce its set of corresponding characteristic column vectors, its eigenvectors. For that, we go back to equation 2.
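For concreteness, here is what that looks like for a generic 2 \times 2 matrix: expanding det(A - \lambda I) gives \lambda^2 - trace(A)\lambda + det(A) = 0, which the quadratic formula solves. The entries below are a hypothetical illustration, and real eigenvalues are assumed (non-negative discriminant).

```python
import math

# Entries of a hypothetical 2x2 matrix [[a, b], [c, d]].
a, b, c, d = 4.0, 2.0, 1.0, 3.0

# det(A - lambda*I) = lambda^2 - (a + d)*lambda + (a*d - b*c)
trace = a + d
det = a * d - b * c
discriminant = trace**2 - 4 * det       # assumed non-negative (real eigenvalues)

lam1 = (trace + math.sqrt(discriminant)) / 2
lam2 = (trace - math.sqrt(discriminant)) / 2
print("eigenvalues:", lam1, lam2)       # 5.0 and 2.0 for these entries
```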

Once you have calculated the eigenvalues associated with a square matrix, the process to obtain the corresponding eigenvectors is just what we called the steps for verifying eigenvalues in the last section (remember that you compute the components of the vector x during that process? That is precisely finding the vector x which is the eigenvector associated with the given eigenvalue). In summary, the complete steps to find all eigenvalues and eigenvectors of a given matrix are:

1. Find the eigenvalues using the characteristic equation described in equation 3.
  1. Calculate the roots of the polynomial resulting from the determinant (using the quadratic formula when the matrix is 2 \times 2).

2. Use the eigenvalues found in order to compute the eigenvectors through equation 2.
  1. Transform the matrix equation (A - \lambda I)x = 0 into an augmented matrix.
  2. Use the row-reduction method in order to obtain the final column vector x.
  3. x is the eigenvector associated with \lambda.

3. Repeat step 2 for each eigenvalue found in step 1, producing at least as many eigenvectors as eigenvalues.

That pretty much sums up the complete process to calculate eigenvalues and eigenvectors; as you can see, there is really not much more to say about how to find the eigenvalues and eigenvectors of the matrix A. Still, we will return to this topic in our next lesson, since the characteristic equation in linear algebra is mainly used to find eigenvalues and eigenvectors; there, the process will be described again with any appropriate detail that is not needed for the moment.
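In practice, the whole pipeline above is available as a single library call; the sketch below uses NumPy on the same hypothetical stand-in matrix as in the earlier sketches and checks each pair against the correspondence condition Ax = \lambda x.

```python
import numpy as np

A = np.array([[4.0, 2.0],
              [1.0, 3.0]])              # hypothetical stand-in matrix

eigenvalues, eigenvectors = np.linalg.eig(A)   # eigenvectors are the columns

for lam, v in zip(eigenvalues, eigenvectors.T):
    # Each pair should satisfy the correspondence condition Av = lambda*v.
    assert np.allclose(A @ v, lam * v)
    print(f"lambda = {lam:.4f}, unit eigenvector = {v}")
```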

Before we continue to our next section, where we will work through eigenvalues and eigenvectors examples, let us talk a little bit about what an eigenspace is:
The eigenspace corresponding to an eigenvalue \lambda is the set of all solutions of the equation (A - \lambda I)x = 0; its basis comes from the general solution, which collects all of the vectors x that satisfy that equation (equation 2). Therefore, for a given eigenvalue, the eigenspace is the set consisting of the zero vector together with all of the eigenvectors of the matrix corresponding to that eigenvalue. In simple words, the eigenspace of a matrix for the eigenvalue \lambda is the null space of the matrix resulting from A - \lambda I.

              The general steps to follow in order to find the basis for the eigenspace corresponding to a given matrix and an eigenvalue are:

1. Set up the equation (A - \lambda I)x = 0.
  1. Calculate the matrix resulting from A - \lambda I.
  2. Use this matrix to form an augmented matrix in order to solve for x.
  3. When finding the components of the vector x, use all of the free variables to form the set of vectors in the general solution for x.
2. Repeat this for every eigenvalue available for the given matrix.

              This process will be shown in example 4 in our next section.
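As a computational counterpart to these steps, the sketch below builds A - \lambda I for a hypothetical 3 \times 3 matrix with a repeated eigenvalue and asks SymPy for a basis of its null space, which is exactly a basis of the eigenspace. The matrix is an illustrative stand-in, not the one from example 4.

```python
from sympy import Matrix, eye

# Hypothetical 3x3 matrix; lambda = 2 is an eigenvalue whose eigenspace turns
# out to be 2-dimensional (two free variables, hence two basis vectors).
A = Matrix([[ 2,  0, 0],
            [ 1,  3, 1],
            [-1, -1, 1]])
lam = 2

M = A - lam * eye(3)        # the matrix A - lambda*I from equation 2
basis = M.nullspace()       # exact basis vectors of the null space = eigenspace

for v in basis:
    print(v.T)              # prints Matrix([[-1, 1, 0]]) and Matrix([[-1, 0, 1]])
```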

              Eigenvalues and eigenvectors examples


              Example 1

If the square matrix A and the vector x are defined as follows:

              Equation 4: Matrix A and column vector x

• Is x an eigenvector of A? If so, find the eigenvalue.
We start by computing the multiplication Ax, as shown in equation 1 for the condition of correspondence between an eigenvalue and an eigenvector.

              Equation 5: Multiplication of matrix A and column vector x

The column vector resulting from equation 5 should be equal to \lambda x, and so we need to find a scalar that can be factored out of the vector in order for this condition (equation 1) to be satisfied:

              Equation 6: Finding the eigenvalue corresponding to vector x

And so, \lambda = 6 is the eigenvalue corresponding to the vector x, proving that x is an eigenvector.
• Now suppose we have another column vector named y, which is defined as:


              Equation 7: Vector y

Is y an eigenvector of A?
              We work with the correspondence condition once more as follows:

              Equation 8: Finding if there is an eigenvalue corresponding to vector y

Notice there is no possible value of \lambda that could satisfy the condition, and so the vector y is NOT an eigenvector of the matrix A.

              Example 2

Having the following 3x3 matrix A and column vector x:

              Equation 9: Matrix A and vector x

Is x an eigenvector of A? If so, find the corresponding eigenvalue.
Once more, we start by computing the left-hand side of the condition Ax = \lambda x:

              Equation 10: Multiplication of matrix A and column vector x

Using the resulting column vector, let's see if we can factor out a scalar to satisfy the condition of correspondence and find an eigenvalue:

              Equation 11: Finding the eigenvalue corresponding to vector x

Therefore, x is an eigenvector, and its corresponding eigenvalue is \lambda = 5.

And this is how we find the eigenvalue corresponding to an eigenvector of the matrix A. The next example shows a similar problem, but instead of starting with a candidate eigenvector, we start with a potential eigenvalue:

              Example 3

Having the following 3x3 matrix A:

              Equation 12: Matrix A

Is \lambda = 1 an eigenvalue of A? If so, find a corresponding eigenvector of A.
In order to verify an eigenvalue, we use equation 2: (A - \lambda I)x = 0. Therefore, let us start by computing the matrix subtraction A - \lambda I:

Equation 13: Matrix subtraction A - \lambda I

Notice that we have set the product (A - \lambda I)x equal to the zero vector, and that x must be a non-zero vector in order to be an eigenvector. Therefore, satisfying (A - \lambda I)x = 0 means that the matrix obtained from the subtraction shown in equation 13 must send a non-zero vector x to the zero vector; in other words, the system must have a non-trivial solution.

And so, we can convert the relationship (A - \lambda I)x = 0 into an augmented matrix to solve for x:

              Equation 14: Augmented matrix

Row reducing the augmented matrix in order to find the vector x:

              Equation 15: Row reducing the augmented matrix

From equation 15, notice that x_3 is a free variable since it doesn't have a corresponding pivot, and so we have obtained the components of the vector x:

              Equation 16: Vector x

Since x_3 is a free variable, we can set it to any non-zero value we want; in this case, for practicality, we set x_3 = 1 and our resulting vector x is:

              Equation 17: Final vector x

As we can observe from examples 2 and 3, finding eigenvalues and eigenvectors goes hand in hand, and so most problems expect you to find the eigenvectors associated with given eigenvalues, or vice versa.
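Since examples 2 and 3 both lean on row reduction, here is a small SymPy sketch of that step for a hypothetical matrix with eigenvalue \lambda = 1 (an illustrative stand-in, since the matrix of example 3 is shown only as an image): it builds the augmented matrix for (A - \lambda I)x = 0 and row reduces it, so the free variable can be read off just as in equations 15 through 17.

```python
from sympy import Matrix, eye, zeros

# Hypothetical 3x3 matrix with eigenvalue lambda = 1 (stand-in for illustration).
A = Matrix([[1, 0, 2],
            [0, 3, 0],
            [0, 0, 1]])
lam = 1

# Augmented matrix (A - lambda*I | 0), then row reduce, mirroring example 3.
M = A - lam * eye(3)
augmented = M.row_join(zeros(3, 1))
reduced, pivot_columns = augmented.rref()

print(reduced)           # reduced echelon form
print(pivot_columns)     # (1, 2): column 0 has no pivot, so x_1 is free
```

Setting the free variable x_1 = 1 here gives the eigenvector (1, 0, 0), the same kind of read-off performed in equation 17.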

              Example 4

              Find a basis for the corresponding eigenspace if:

              Equation 18: Matrix A and eigenvalue 2

We calculate the matrix subtraction A - \lambda I:

Equation 19: Matrix subtraction A - \lambda I

Using equation 2: (A - \lambda I)x = 0, we set up an augmented matrix in order to solve for the vector x:

Equation 20: Producing an augmented matrix and row reducing to find the components of x

From equation 20 we can observe that x_2 and x_3 are both free variables. So we solve for x_1:

x_1 = \frac{-x_2 - 2x_3}{5} = -\frac{1}{5}x_2 - \frac{2}{5}x_3
Equation 21: Solving for x_1

And we can finally write down the vector x in terms of the free variables x_2 and x_3:

              Equation 22: Vector x

              And the basis for the eigenspace is:

              Equation 23: Basis for the eigenspace


              ***

Now that we have seen the definition of eigenvalues and eigenvectors, along with what an eigenspace is, let us comment a little bit on the uses of eigenvalues and eigenvectors: when studying the rotational motion of rigid objects, eigenvectors determine the principal axes about which a body can rotate following the principles of Newton's laws of motion; thus, the concepts of eigenvalues and eigenvectors contribute to the toolbox when working on problems in the mechanics of rigid bodies.

And so, after learning the core process of how to find the eigenvalues and eigenvectors of a matrix, we finish this section by providing you with a few recommendations on links we think may be useful for your independent studies: first, we suggest you visit this review on eigenvalues and eigenvectors, since it contains a set of extensively explained problems. This other linear algebra article on eigenvalues and eigenvectors shows the process using symmetric matrices.

This is it for this lesson; see you in the next one!
An eigenvector of an n \times n matrix A is a non-zero vector x such that Ax = \lambda x for some scalar \lambda. The scalar \lambda is called the eigenvalue.

We say the eigenvector x corresponds to the eigenvalue \lambda.

Given an eigenvalue \lambda of a matrix A, we can find a corresponding eigenvector x by solving
(A - \lambda I)x = 0
and finding a non-trivial solution x.

The eigenspace is the null space of the matrix A - \lambda I. In other words, the eigenspace is the set of all solutions of the equation
(A - \lambda I)x = 0

Of course, we can find a basis for the eigenspace by finding a basis of the null space of A - \lambda I.
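To close the notes, here is a compact end-to-end check with SymPy on a hypothetical stand-in matrix: eigenvects() returns each eigenvalue together with its algebraic multiplicity and a basis of the corresponding eigenspace, i.e. a basis of the null space of A - \lambda I, and each basis vector is verified against (A - \lambda I)x = 0.

```python
from sympy import Matrix

A = Matrix([[4, 2],
            [1, 3]])        # hypothetical stand-in matrix

for lam, multiplicity, basis in A.eigenvects():
    print("eigenvalue:", lam, " algebraic multiplicity:", multiplicity)
    for v in basis:
        # Every basis vector of the eigenspace satisfies (A - lambda*I)v = 0.
        assert (A - lam * Matrix.eye(2)) * v == Matrix.zeros(2, 1)
        print("  eigenspace basis vector:", v.T)
```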