Orthogonal sets


Intros
Lessons
  1. Orthogonal Sets Overview:
  2. Orthogonal Sets and Basis
    • Each pair of vectors is orthogonal
    • Linearly independent → form a basis
    • Calculate weights with the formula
  3. Orthonormal Sets and Basis
    • Is an orthogonal set
    • Each vector is a unit vector
    • Linearly independent → form a basis
  4. Matrix U with Orthonormal Columns and Properties
    U^T U = I
    • 3 Properties of Matrix U
  5. Orthogonal Projection and Component
    • Orthogonal projection of y onto v
    • The component of y orthogonal to v
Examples
Lessons
  1. Orthogonal Sets and Basis
    Is this an orthogonal set?
    Verify that this is an orthogonal basis for \Bbb{R}^2, and then express it as a linear combination of the set of vectors in B.
  2. Orthonormal Sets/Basis
    Is set B an orthonormal basis for \Bbb{R}^3?
  3. Let matrix U and vectors x and y be given, where U has orthonormal columns and U^T U = I. Verify that
    (Ux) \cdot (Uy) = x \cdot y
  4. Orthogonal Projection
    Let vector y and vector v be given. Write y as the sum of two orthogonal vectors, one in Span{v} and one orthogonal to v.
            Topic Notes

            Introduction to Orthogonal Sets

            Welcome to our exploration of orthogonal sets, a fundamental concept in linear algebra! Orthogonal sets are collections of vectors that are perpendicular to each other, forming right angles in multi-dimensional space. These sets are crucial in linear algebra because they simplify many calculations and provide a powerful framework for understanding vector spaces. The term "orthogonal" comes from the Greek words "orthos" (straight) and "gonia" (angle), perfectly describing the relationship between these vectors. Our introduction video delves into this concept, offering visual representations that make it easier to grasp. You'll discover how orthogonal sets relate to the dot product of vectors and why they're so important in fields like computer graphics, signal processing, and data analysis. As we progress, you'll see how orthogonal sets lead to orthonormal bases, which are essential for many advanced topics in linear algebra. So, let's dive in and unravel the fascinating world of orthogonal sets together!

            Understanding Orthogonality in Vector Sets

            What is Orthogonality in Vectors?

            Orthogonality is a fundamental concept in linear algebra and vector spaces. Two vectors are considered orthogonal if they are perpendicular to each other. In simpler terms, orthogonal vectors form a 90-degree angle between them. This concept extends beyond just two dimensions and applies to vectors in any n-dimensional space.

            The Mathematical Definition of Orthogonality

            Mathematically, two vectors u and v are orthogonal if their dot product equals zero. The dot product, also known as the scalar product, is a way to multiply vectors that results in a scalar value. For orthogonal vectors, this scalar value is always zero. This can be expressed as:

            u · v = 0

            Where · represents the dot product operation.

            How to Determine if Vectors are Orthogonal

            To determine if vectors are orthogonal, follow these steps:

            1. Calculate the dot product of the vectors.
            2. If the result is zero, the vectors are orthogonal.
            3. If the result is not zero, the vectors are not orthogonal.

            For example, let's consider two vectors in 2D space: u = (3, 4) and v = (-4, 3). To check if they're orthogonal:

            u · v = (3 × -4) + (4 × 3) = -12 + 12 = 0

            Since the dot product is zero, these vectors are orthogonal.
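
If you'd like to verify this numerically, here is a minimal sketch using NumPy (an assumption; any numeric library would do) that reproduces the calculation above:

```python
import numpy as np

u = np.array([3, 4])
v = np.array([-4, 3])

# Orthogonal vectors have a zero dot product.
print(np.dot(u, v))        # 3*(-4) + 4*3 = 0
print(np.dot(u, v) == 0)   # True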

            Properties of Orthogonal Vectors

            Orthogonal vectors have several important properties:

            • They are linearly independent, meaning no vector in the set can be expressed as a linear combination of the others.
            • They form a basis for a vector space, which is useful in many mathematical and practical applications.
            • The Pythagorean theorem applies to orthogonal vectors.

            Orthogonality in Higher Dimensions

            While it's easy to visualize orthogonality in 2D or 3D spaces, the concept extends to higher dimensions. In an n-dimensional space, two vectors u = (u₁, u₂, ..., uₙ) and v = (v₁, v₂, ..., vₙ) are orthogonal if:

            u₁v₁ + u₂v₂ + ... + uₙvₙ = 0

            Applications of Orthogonality

            Orthogonality is crucial in various fields:

            • Signal Processing: Orthogonal signals don't interfere with each other, allowing for efficient data transmission.
            • Computer Graphics: Orthogonal vectors are used to define coordinate systems and transformations.
            • Statistics: Orthogonal variables in regression analysis help avoid multicollinearity issues.
            • Quantum Mechanics: Orthogonal wave functions represent distinct quantum states.

            Orthogonal Example: The Standard Basis

            A classic example of orthogonal vectors is the standard basis in 3D space:

            i = (1, 0, 0)

            j = (0, 1, 0)

            k = (0, 0, 1)

            These vectors are mutually orthogonal, meaning each is orthogonal to the other two. You can verify this by calculating their dot products:

            i · j = 0, j · k = 0, i · k = 0

            Characteristics of Orthogonal Sets

            Orthogonal sets are fundamental in linear algebra and have numerous applications in mathematics, physics, and engineering. An orthogonal set is a collection of vectors where each vector is perpendicular to every other vector in the set. This unique property makes orthogonal sets particularly useful in various mathematical and practical contexts.

            Key characteristics of orthogonal sets include:

            • Perpendicularity: Each vector in the set is perpendicular to every other vector.
            • Linear independence: Orthogonal vectors are always linearly independent.
            • Simplification of calculations: Orthogonal sets simplify many mathematical operations.
            • Basis formation: Orthogonal sets can form a basis for a vector space.

            To check if a set of vectors is orthogonal, follow these steps:

            1. Calculate the dot product of each pair of vectors in the set.
            2. If the dot product of any pair is zero, those vectors are orthogonal.
            3. Repeat this process for all possible pairs in the set.
            4. If all dot products are zero, the entire set is orthogonal.

            Let's consider a practical example to reinforce understanding. Suppose we have three vectors in R³:

            v₁ = (1, 0, 0), v₂ = (0, 1, 0), v₃ = (0, 0, 1)

            To check if these vectors form an orthogonal set:

            1. Calculate v₁ · v₂ = (1)(0) + (0)(1) + (0)(0) = 0
            2. Calculate v₁ · v₃ = (1)(0) + (0)(0) + (0)(1) = 0
            3. Calculate v₂ · v₃ = (0)(0) + (1)(0) + (0)(1) = 0

            Since all dot products are zero, this set of vectors is orthogonal. This example demonstrates the standard basis vectors in R³, which are always orthogonal.

            Another example in R² could be:

            u = (3, 4) and v = (-4, 3)

            Checking orthogonality:

            u · v = (3)(-4) + (4)(3) = -12 + 12 = 0

            Therefore, u and v are orthogonal.
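
For larger sets, the pairwise checks are easy to automate. The helper below is a hypothetical sketch (the name is_orthogonal_set is ours, not part of the lesson) that applies the checking steps above with NumPy:

```python
import numpy as np

def is_orthogonal_set(vectors, tol=1e-12):
    """The set is orthogonal if every distinct pair has a zero dot product."""
    for i in range(len(vectors)):
        for j in range(i + 1, len(vectors)):
            if abs(np.dot(vectors[i], vectors[j])) > tol:
                return False
    return True

standard_basis = [np.array([1, 0, 0]), np.array([0, 1, 0]), np.array([0, 0, 1])]
print(is_orthogonal_set(standard_basis))                         # True
print(is_orthogonal_set([np.array([3, 4]), np.array([-4, 3])]))  # True
```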

            Understanding orthogonal sets is crucial in various applications, such as:

            • Coordinate systems: Orthogonal vectors form the basis of Cartesian coordinates.
            • Signal processing: Orthogonal functions are used in Fourier analysis.
            • Quantum mechanics: Orthogonal states represent distinct quantum states.
            • Computer graphics: Orthogonal matrices are used in 3D transformations.

            By mastering the concept of orthogonal sets and how to check for orthogonality, you'll be well-equipped to tackle advanced topics in linear algebra and its applications across various fields of science and engineering.

            Finding Orthogonal Vectors

            Understanding how to find orthogonal vectors is a crucial skill in linear algebra and geometry. Orthogonal vectors are perpendicular to each other, forming a 90-degree angle. This concept is fundamental in various mathematical and scientific applications. In this guide, we'll explore methods for finding orthogonal vectors and provide examples to illustrate the process.

            One of the primary techniques used to determine orthogonal vectors is the inner product method. The inner product, also known as the dot product, is a powerful tool in vector algebra. Two vectors are considered orthogonal if their inner product equals zero. This property forms the basis for finding orthogonal vectors.

            To find orthogonal vectors using the inner product method, follow these steps:

            1. Choose a vector or set of vectors.
            2. Calculate the inner product of the chosen vector(s) with potential orthogonal vectors.
            3. Set the inner product equal to zero and solve for the unknown components.

            Let's consider an orthogonal example in two-dimensional space. Suppose we have a vector a = (3, 4) and want to find a vector b that is orthogonal to it. We can represent b as (x, y), where x and y are unknown. The inner product of a and b should equal zero:

            3x + 4y = 0

            From this equation, we can derive multiple solutions. One possible orthogonal vector is b = (-4, 3), as it satisfies the equation. Another solution could be b = (4, -3). Both these vectors are perpendicular to a = (3, 4).

            In three-dimensional space, the process is similar but involves an additional component. For instance, if we have a vector a = (1, 2, 3), we can find an orthogonal vector b = (x, y, z) by solving the equation:

            1x + 2y + 3z = 0

            This equation has infinitely many solutions, allowing us to choose values for two components and solve for the third.
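
As a quick illustration (assuming NumPy; the chosen component values are arbitrary), you can pick two components freely and solve the equation for the remaining one:

```python
import numpy as np

a = np.array([1, 2, 3])

# Solve 1x + 2y + 3z = 0: choose y and z freely, then solve for x.
y, z = 1.0, 1.0
x = -(2 * y + 3 * z)        # the coefficient of x is 1
b = np.array([x, y, z])

print(b)                    # [-5.  1.  1.]
print(np.dot(a, b))         # 0.0, so b is orthogonal to a
```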

            Another method for finding orthogonal vectors is the cross product technique, which is specific to three-dimensional space. The cross product of two vectors results in a vector that is orthogonal to both input vectors. This method is particularly useful when you need to find a vector perpendicular to two given vectors.

            For example, if we have vectors a = (1, 0, 0) and b = (0, 1, 0), their cross product a × b = (0, 0, 1) is orthogonal to both a and b.
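
A minimal NumPy check of this cross product example (np.cross is the relevant routine):

```python
import numpy as np

a = np.array([1, 0, 0])
b = np.array([0, 1, 0])

n = np.cross(a, b)                   # orthogonal to both inputs
print(n)                             # [0 0 1]
print(np.dot(n, a), np.dot(n, b))    # 0 0
```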

            In higher-dimensional spaces, the Gram-Schmidt process is a powerful method for finding orthogonal vectors. This process takes a set of linearly independent vectors and transforms them into a set of orthogonal vectors. It's particularly useful in creating orthonormal bases for vector spaces.

            To illustrate the Gram-Schmidt process, consider two vectors in three-dimensional space: u1 = (1, 1, 1) and u2 = (1, 2, 3). The process would involve the following steps:

            1. Keep the first vector as is: v1 = u1 = (1, 1, 1)
            2. Subtract the projection of u2 onto v1 from u2 to get v2
            3. Normalize the resulting vectors if needed

            This process ensures that v1 and v2 are orthogonal to each other.
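
Here is a short sketch of those Gram-Schmidt steps in NumPy, using the vectors u1 and u2 from the example; treat it as an illustration rather than a full implementation:

```python
import numpy as np

u1 = np.array([1.0, 1.0, 1.0])
u2 = np.array([1.0, 2.0, 3.0])

# Step 1: keep the first vector as is.
v1 = u1

# Step 2: subtract from u2 its projection onto v1.
v2 = u2 - (np.dot(u2, v1) / np.dot(v1, v1)) * v1

print(v2)                  # [-1.  0.  1.]
print(np.dot(v1, v2))      # 0.0, so v1 and v2 are orthogonal

# Step 3 (optional): normalize to obtain an orthonormal pair.
e1 = v1 / np.linalg.norm(v1)
e2 = v2 / np.linalg.norm(v2)
```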

            In practical applications, such as computer graphics and physics simulations, finding orthogonal vectors is essential for creating coordinate systems, defining rotations, and solving complex equations. The ability to determine orthogonal vectors efficiently is a valuable skill in many fields of science and engineering.

            In conclusion, mastering the techniques for finding orthogonal vectors, from the inner product and cross product methods to the Gram-Schmidt process, prepares you to build coordinate systems, construct orthogonal bases, and solve problems across science and engineering.

            Applications of Orthogonal Sets

            Orthogonal sets play a crucial role in various fields, offering practical solutions to complex problems. In physics, engineering, and computer graphics, the concept of orthogonality proves invaluable. Let's explore some real-world applications of orthogonal vectors and their significance.

            In physics, orthogonal sets are fundamental to understanding and describing motion. For instance, in classical mechanics, the position, velocity, and acceleration of an object can be represented using orthogonal vectors in a three-dimensional coordinate system. This orthogonal representation allows physicists to analyze and predict the object's behavior more efficiently. Similarly, in quantum mechanics, orthogonal wave functions are essential for describing the quantum states of particles, enabling scientists to study and manipulate subatomic phenomena.

            Engineering applications of orthogonal sets are widespread. In electrical engineering, orthogonal frequency-division multiplexing (OFDM) is a technique used in wireless communication systems. OFDM utilizes orthogonal subcarriers to transmit multiple signals simultaneously, increasing data transmission rates and spectral efficiency. This technology is crucial in modern Wi-Fi, 4G, and 5G networks. In structural engineering, orthogonal sets help in analyzing and designing buildings to withstand various forces, such as wind loads and seismic activity.

            Computer graphics heavily rely on orthogonal vectors for rendering 3D objects and scenes. The use of orthogonal basis vectors in 3D modeling allows for efficient representation and manipulation of objects in virtual space. This is particularly important in video game development, computer-aided design (CAD), and animation. Orthogonal projections are also used in creating 2D views of 3D objects, which is essential in technical drawing and architectural design.

            In signal processing, orthogonal transforms like the Fourier transform and wavelet transform are indispensable tools. These transforms decompose signals into orthogonal components, enabling efficient analysis, compression, and filtering of data. This has applications in image and audio processing, data compression algorithms, and noise reduction techniques.

            Machine learning and data science also benefit from orthogonal sets. Principal Component Analysis (PCA), a widely used dimensionality reduction technique, relies on finding orthogonal vectors that capture the most significant variations in data. This allows for more efficient data representation and analysis, crucial in fields like facial recognition, gene expression analysis, and financial modeling.

            In conclusion, the applications of orthogonal sets extend far beyond theoretical mathematics. From the fundamental laws of physics to cutting-edge technologies in communication and computing, orthogonality provides a powerful framework for solving complex problems. As technology continues to advance, the importance of understanding and applying orthogonal vectors in various fields will only grow, making it an essential concept for students, researchers, and professionals alike.

            Orthonormal Sets: A Special Case

            In the realm of linear algebra, orthonormal sets stand out as a special and particularly useful case of orthogonal sets. To understand orthonormal sets, it's essential to first grasp the concept of orthogonal sets and then explore the additional property that makes a set orthonormal.

            Orthogonal sets are collections of vectors where each vector is perpendicular to every other vector in the set. This perpendicularity is mathematically expressed through the dot product of any two distinct vectors in the set being zero. Orthogonal sets are valuable in many mathematical and practical applications due to their unique properties.

            Orthonormal sets take the concept of orthogonality a step further by introducing an additional constraint: all vectors in the set must be unit vectors. A unit vector is a vector with a magnitude (length) of 1. This additional property transforms an orthogonal set into an orthonormal set, making it even more powerful and convenient for various mathematical operations.

            To determine if a set is orthonormal, two conditions must be met:

            1. Orthogonality: The dot product of any two distinct vectors in the set must be zero.
            2. Unit length: Each vector in the set must have a magnitude of 1.

            Mathematically, for a set of vectors {v₁, v₂, ..., vₙ} to be orthonormal:

            • vᵢ · vⱼ = 0 for all i ≠ j (orthogonality condition)
            • ||vᵢ|| = 1 for all i (unit length condition)

            Let's illustrate the difference between orthogonal and orthonormal sets with examples:

            Example 1 (Orthogonal set): Consider the vectors a = (3, 0) and b = (0, 4) in R². These vectors are orthogonal because their dot product is zero: a · b = 3(0) + 0(4) = 0. However, this set is not orthonormal because the vectors are not unit vectors (||a|| = 3, ||b|| = 4).

            Example 2 (Orthonormal set): Now, let's modify the previous set to make it orthonormal. We can normalize the vectors by dividing each by its magnitude: a' = (1, 0) and b' = (0, 1). This new set {a', b'} is orthonormal because:

            • a' · b' = 0 (orthogonality condition is met)
            • ||a'|| = ||b'|| = 1 (unit length condition is met)

            The transition from orthogonal to orthonormal sets brings several advantages. Orthonormal sets simplify many calculations in linear algebra, such as projections, coordinate transformations, and the construction of orthonormal bases. They play a crucial role in various applications, including quantum mechanics, signal processing, and computer graphics.

            In practice, it's often easier to start with an orthogonal set and then normalize each vector to create an orthonormal set. This process, known as orthonormalization, is commonly achieved through methods like the Gram-Schmidt process.
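
As a small illustration (assuming NumPy), normalizing the orthogonal set from Example 1 produces exactly the orthonormal set from Example 2:

```python
import numpy as np

a = np.array([3.0, 0.0])
b = np.array([0.0, 4.0])

# Divide each vector in the orthogonal set by its magnitude.
a_unit = a / np.linalg.norm(a)   # (1, 0)
b_unit = b / np.linalg.norm(b)   # (0, 1)

print(np.dot(a_unit, b_unit))                            # 0.0, still orthogonal
print(np.linalg.norm(a_unit), np.linalg.norm(b_unit))    # 1.0 1.0, now unit length
```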

            Understanding the distinction between orthogonal and orthonormal sets is crucial for students and practitioners of linear algebra. While both types of sets offer the benefit of perpendicularity, orthonormal sets provide the additional advantage of unit length vectors, making them a powerful tool in various mathematical and practical applications.

            Conclusion

            In this exploration of orthogonal sets, we've delved into a crucial concept in linear algebra. Orthogonality, characterized by vectors with zero dot products, forms the foundation for understanding orthogonal sets. These sets possess unique properties that make them invaluable in various mathematical applications. We've seen how orthogonal sets simplify calculations, provide efficient basis representations, and play a vital role in advanced linear algebra topics. Understanding orthogonality is essential for anyone studying linear algebra, as it underpins many advanced concepts and practical applications. If you feel you need a refresher, we encourage you to rewatch the introduction video for a comprehensive overview. To deepen your understanding of orthogonal sets and their applications, explore additional resources and practice problems. Remember, mastering orthogonality will significantly enhance your grasp of linear algebra as a whole. Don't hesitate to engage with further materials and join discussions to solidify your knowledge in this fundamental area of mathematics.

            Orthogonal Sets Overview: Orthogonal Sets and Basis

            • Each pair of vectors is orthogonal
            • Linearly independent → form a basis
            • Calculate weights with the formula

            Step 1: Understanding Orthogonal Sets

            Orthogonal sets are collections of vectors where each pair of vectors is orthogonal. This means that the inner product (or dot product) of any two distinct vectors in the set is zero. For example, if you have vectors v_1, v_2, \ldots, v_n, then for any i \neq j, the inner product v_i \cdot v_j = 0. This property is crucial because it ensures that the vectors are mutually perpendicular, which simplifies many calculations in linear algebra.

            Step 2: Verifying Orthogonality

            To verify if a set of vectors is orthogonal, you need to check the inner product of each pair of vectors. For instance, if you have vectors v_1 = (3, 1) and v_2 = (-1, 3), you calculate their inner product as follows:

            • Multiply the corresponding entries: 3 \times (-1) + 1 \times 3
            • Add the results: -3 + 3 = 0
            Since the inner product is zero, v_1 and v_2 are orthogonal. Repeat this process for all pairs in the set to confirm orthogonality.

            Step 3: Forming a Basis

            If a set of vectors is orthogonal and none of the vectors is the zero vector, then the set is linearly independent. Linear independence means that no vector in the set can be written as a linear combination of the others. Consequently, an orthogonal set of vectors in \mathbb{R}^n forms a basis for a subspace of \mathbb{R}^n. This basis is called an orthogonal basis. For example, if vectors v_1, v_2, \ldots, v_n are orthogonal and linearly independent, they form a basis for the subspace they span.

            Step 4: Calculating Weights Using the Formula

            Once you have an orthogonal basis, you can express any vector in the subspace as a linear combination of the basis vectors. The coefficients (weights) of this linear combination can be calculated using a simple formula. For a vector y and an orthogonal basis \{v_1, v_2, \ldots, v_n\}, the weight c_i for the basis vector v_i is given by:

            • c_i = \frac{y \cdot v_i}{v_i \cdot v_i}
            This formula leverages the orthogonality of the basis vectors to simplify the calculation of the weights. For example, if you want to find the weight c_1 for the vector v_1, you compute the inner product of y and v_1, and then divide by the inner product of v_1 with itself.

            Step 5: Example Calculation

            Let's consider an example where you have an orthogonal basis \{v_1, v_2\} in \mathbb{R}^2 and you want to express a vector y = (1, 2) as a linear combination of v_1 and v_2. Suppose v_1 = (4, 2) and v_2 = (-1, 2). To find the weights c_1 and c_2:

            • Calculate c_1 = \frac{y \cdot v_1}{v_1 \cdot v_1} = \frac{1 \cdot 4 + 2 \cdot 2}{4 \cdot 4 + 2 \cdot 2} = \frac{4 + 4}{16 + 4} = \frac{8}{20} = \frac{2}{5}
            • Calculate c_2 = \frac{y \cdot v_2}{v_2 \cdot v_2} = \frac{1 \cdot (-1) + 2 \cdot 2}{(-1) \cdot (-1) + 2 \cdot 2} = \frac{-1 + 4}{1 + 4} = \frac{3}{5}
            Therefore, y can be expressed as y = \frac{2}{5}v_1 + \frac{3}{5}v_2.
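
A short numerical check of this calculation, assuming NumPy is available:

```python
import numpy as np

y  = np.array([1.0, 2.0])
v1 = np.array([4.0, 2.0])
v2 = np.array([-1.0, 2.0])

# Weights from the orthogonal-basis formula c_i = (y . v_i) / (v_i . v_i).
c1 = np.dot(y, v1) / np.dot(v1, v1)   # 8/20 = 0.4
c2 = np.dot(y, v2) / np.dot(v2, v2)   # 3/5  = 0.6

print(c1, c2)                # 0.4 0.6
print(c1 * v1 + c2 * v2)     # [1. 2.], which recovers y
```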

            Step 6: Summary

            In summary, orthogonal sets are collections of vectors where each pair is orthogonal. These sets are linearly independent and form a basis for a subspace. The weights for expressing any vector in the subspace as a linear combination of the basis vectors can be easily calculated using the inner product formula. This method simplifies many calculations in linear algebra and is particularly useful in various applications.

            FAQs

            Here are some frequently asked questions about orthogonal sets:

            1. What is an orthogonal set?

            An orthogonal set is a collection of vectors where each vector is perpendicular to every other vector in the set. Mathematically, this means that the dot product of any two distinct vectors in the set is zero.

            2. How do you determine if vectors are orthogonal?

            To determine if vectors are orthogonal, calculate their dot product. If the dot product is zero, the vectors are orthogonal. For example, if u = (a, b) and v = (c, d), they are orthogonal if ac + bd = 0.

            3. What is the difference between orthogonal and orthonormal sets?

            An orthogonal set consists of vectors that are perpendicular to each other. An orthonormal set is an orthogonal set where all vectors are also unit vectors (have a magnitude of 1). Orthonormal sets simplify many calculations in linear algebra.

            4. What are some applications of orthogonal sets?

            Orthogonal sets have numerous applications, including signal processing (e.g., Fourier transforms), quantum mechanics (describing quantum states), computer graphics (3D modeling and transformations), and data analysis (Principal Component Analysis).

            5. How do you find an orthogonal vector to a given vector?

            To find an orthogonal vector to a given vector, you can use the property that their dot product must be zero. For example, if given vector v = (a, b), an orthogonal vector u = (x, y) must satisfy ax + by = 0. You can choose one component and solve for the other.

            Prerequisite Topics

            Understanding the foundation of advanced mathematical concepts is crucial for mastering complex topics like orthogonal sets. One of the most important prerequisite topics for grasping orthogonal sets is linear independence. This fundamental concept in linear algebra plays a pivotal role in comprehending the nature and properties of orthogonal sets.

            Linear independence is essential because it forms the basis for understanding vector spaces and their properties. When studying orthogonal sets, students must have a solid grasp of how vectors can be linearly independent or dependent. This knowledge directly translates to the concept of orthogonality, where vectors are perpendicular to each other and, by definition, linearly independent.

            The relationship between linear independence and orthogonal sets becomes evident when considering the properties of orthogonal vectors. In an orthogonal set, each vector is perpendicular to every other vector in the set. This perpendicularity ensures that no vector in the set can be expressed as a linear combination of the others, which is the very definition of linear independence.

            Moreover, understanding linear independence helps students appreciate the power and utility of orthogonal sets in various mathematical and practical applications. For instance, in signal processing and data analysis, orthogonal sets of functions or vectors are used to represent complex signals or data in a more manageable form. The concept of linear independence is crucial in ensuring that these representations are unique and efficient.

            Students who have mastered linear independence will find it much easier to grasp the properties of orthogonal sets, such as the fact that any subset of an orthogonal set is also orthogonal. This follows because orthogonality is a pairwise condition, and it connects directly to linear independence: any orthogonal set of nonzero vectors is automatically linearly independent.

            Furthermore, the study of linear independence prepares students for more advanced topics related to orthogonal sets, such as orthogonal projections, Gram-Schmidt orthogonalization process, and orthogonal matrices. These concepts build upon the foundation of linear independence and extend it to create powerful tools in linear algebra and its applications.

            In conclusion, a thorough understanding of linear independence is indispensable for students approaching the study of orthogonal sets. It provides the necessary conceptual framework to appreciate the unique properties of orthogonal vectors and sets, and it paves the way for exploring more advanced topics in linear algebra and related fields. By mastering this prerequisite topic, students will be well-equipped to tackle the challenges and applications of orthogonal sets in their mathematical journey.

            A set of vectors {v_1, \cdots, v_n} in \Bbb{R}^n is an orthogonal set if each pair of vectors from the set is orthogonal. In other words,
            v_i \cdot v_j = 0
            where i \neq j.

            If the set of vectors {v_1, \cdots, v_n} in \Bbb{R}^n is an orthogonal set, then the vectors are linearly independent. Thus, the vectors form a basis for a subspace S. We call this an orthogonal basis.

            To check if a set is an orthogonal basis in \Bbb{R}^n, simply verify that it is an orthogonal set.

            The weights c_i in the linear combination
            y = c_1 v_1 + c_2 v_2 + \cdots + c_p v_p
            are calculated by using the formula:
            c_i = \frac{y \cdot v_i}{v_i \cdot v_i}
            where i = 1, \cdots, p.

            A set {v_1, \cdots, v_p} is an orthonormal set if it's an orthogonal set of unit vectors.

            If S is a subspace spanned by this set, then we say that {v_1, \cdots, v_p} is an orthonormal basis for S. This is because the vectors are automatically linearly independent.

            An m \times n matrix U has orthonormal columns if and only if U^T U = I.

            Let U be an m \times n matrix with orthonormal columns, and let x and y be in \Bbb{R}^n. Then the following three things are true:
            1) \lVert Ux \rVert = \lVert x \rVert
            2) (Ux) \cdot (Uy) = x \cdot y
            3) (Ux) \cdot (Uy) = 0 if and only if x \cdot y = 0
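
These properties are easy to confirm numerically. The sketch below assumes NumPy and builds a matrix with orthonormal columns from a QR factorization of a random matrix (an illustrative choice, not part of the notes):

```python
import numpy as np

rng = np.random.default_rng(0)

# A (reduced) QR factorization yields U with orthonormal columns.
U, _ = np.linalg.qr(rng.standard_normal((5, 3)))

x = rng.standard_normal(3)
y = rng.standard_normal(3)

print(np.allclose(U.T @ U, np.eye(3)))                          # U^T U = I
print(np.isclose(np.linalg.norm(U @ x), np.linalg.norm(x)))     # ||Ux|| = ||x||
print(np.isclose((U @ x) @ (U @ y), x @ y))                     # (Ux).(Uy) = x.y
```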

            Consider L to be the subspace spanned by the vector v. Then the orthogonal projection of y onto v is calculated to be:
            \hat{y} = \text{proj}_L \, y = \frac{y \cdot v}{v \cdot v} v

            The component of y orthogonal to v (denoted as z) would be:
            z = y - \hat{y}
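
A minimal sketch of the projection and its orthogonal component in NumPy; the vectors y and v here are illustrative values, not taken from the notes:

```python
import numpy as np

y = np.array([7.0, 6.0])
v = np.array([4.0, 2.0])

# Orthogonal projection of y onto v: y_hat = ((y . v) / (v . v)) v
y_hat = (np.dot(y, v) / np.dot(v, v)) * v

# Component of y orthogonal to v.
z = y - y_hat

print(y_hat)            # [8. 4.]
print(z)                # [-1.  2.]
print(np.dot(z, v))     # 0.0, so z is orthogonal to v
```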