Orthogonal projections - Orthogonality and Least Squares


Orthogonal projections


The Orthogonal Decomposition Theorem
Let $S$ be a subspace of $\Bbb{R}^n$. Then each vector $y$ in $\Bbb{R}^n$ can be written uniquely as:

$y = \hat{y} + z$

where $\hat{y}$ is in $S$ and $z$ is in $S^{\perp}$. Note that $\hat{y}$ is the orthogonal projection of $y$ onto $S$.

If $\{v_1, \cdots, v_p\}$ is an orthogonal basis of $S$, then

$\text{proj}_S\, y = \hat{y} = \dfrac{y \cdot v_1}{v_1 \cdot v_1}v_1 + \dfrac{y \cdot v_2}{v_2 \cdot v_2}v_2 + \cdots + \dfrac{y \cdot v_p}{v_p \cdot v_p}v_p$
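As a quick sketch of the orthogonal-basis formula (using NumPy, with a made-up basis and vector chosen so that $v_1 \cdot v_2 = 0$):

```python
import numpy as np

# Hypothetical example: project y onto S = span{v1, v2} in R^3,
# where {v1, v2} is an orthogonal (but not orthonormal) basis.
v1 = np.array([1.0, 1.0, 0.0])
v2 = np.array([1.0, -1.0, 0.0])   # v1 . v2 = 0, so the basis is orthogonal
y = np.array([3.0, 1.0, 4.0])

# proj_S y = (y.v1)/(v1.v1) v1 + (y.v2)/(v2.v2) v2
y_hat = (y @ v1) / (v1 @ v1) * v1 + (y @ v2) / (v2 @ v2) * v2
z = y - y_hat                      # the component in S-perp

print(y_hat)  # → [3. 1. 0.]
print(z @ v1, z @ v2)  # → 0.0 0.0  (z is orthogonal to S)
```

Note that $z = y - \hat{y}$ is the piece of the decomposition that lies in $S^{\perp}$, which is why its dot product with each basis vector is zero.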

However, if $\{v_1, \cdots, v_p\}$ is an orthonormal basis of $S$, then

$\text{proj}_S\, y = \hat{y} = (y \cdot v_1)v_1 + (y \cdot v_2)v_2 + \cdots + (y \cdot v_p)v_p$
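In the orthonormal case the denominators $v_i \cdot v_i$ are all 1, so the formula simplifies. A sketch with the same (hypothetical) subspace as above, after normalizing the basis:

```python
import numpy as np

# Normalize an orthogonal basis of S to get an orthonormal one.
u1 = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)
u2 = np.array([1.0, -1.0, 0.0]) / np.sqrt(2)
y = np.array([3.0, 1.0, 4.0])

# proj_S y = (y.u1) u1 + (y.u2) u2  -- no division needed
y_hat = (y @ u1) * u1 + (y @ u2) * u2
print(y_hat)  # → [3. 1. 0.]
```

Both formulas give the same projection; the orthonormal version just avoids dividing by each $v_i \cdot v_i$.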

Property of Orthogonal Projection
If $\{v_1, \cdots, v_p\}$ is an orthogonal basis for $S$ and if $y$ happens to be in $S$, then

$\text{proj}_S\, y = y$

In other words, if $y$ is in $S = \text{Span}\{v_1, \cdots, v_p\}$, then $\text{proj}_S\, y = y$.
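A quick check of this property (with assumed example vectors): build $y$ as a linear combination of the basis, so it lies in $S$ by construction, and verify the projection returns it unchanged.

```python
import numpy as np

# Orthogonal basis of S, and a y built to lie inside S.
v1 = np.array([1.0, 1.0, 0.0])
v2 = np.array([1.0, -1.0, 0.0])
y = 2 * v1 - 3 * v2              # y is in S = Span{v1, v2}

y_hat = (y @ v1) / (v1 @ v1) * v1 + (y @ v2) / (v2 @ v2) * v2
print(np.allclose(y_hat, y))  # → True: projecting a vector in S gives it back
```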

The Best Approximation Theorem
Let $S$ be a subspace of $\Bbb{R}^n$. Also, let $y$ be a vector in $\Bbb{R}^n$, and $\hat{y}$ be the orthogonal projection of $y$ onto $S$. Then $\hat{y}$ is the closest point in $S$ to $y$, because

$\lVert y - \hat{y} \rVert < \lVert y - u \rVert$

for all vectors $u$ in $S$ distinct from $\hat{y}$.
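The theorem can be illustrated numerically (a sketch with assumed vectors): compare the distance from $y$ to $\hat{y}$ against the distance from $y$ to a few other random points of $S$.

```python
import numpy as np

rng = np.random.default_rng(0)
v1 = np.array([1.0, 1.0, 0.0])
v2 = np.array([1.0, -1.0, 0.0])
y = np.array([3.0, 1.0, 4.0])

# Orthogonal projection of y onto S = span{v1, v2}.
y_hat = (y @ v1) / (v1 @ v1) * v1 + (y @ v2) / (v2 @ v2) * v2
best = np.linalg.norm(y - y_hat)

# No other point u of S gets closer to y than y_hat does.
for _ in range(5):
    a, b = rng.standard_normal(2)
    u = a * v1 + b * v2           # some other point of S
    print(best <= np.linalg.norm(y - u))  # → True each time
```

Geometrically, $y - \hat{y}$ is perpendicular to $S$, so by the Pythagorean theorem any detour to another $u$ in $S$ only adds length.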