Orthogonal Eigenvector, Proof is bothering me

In summary: the dot product of two vectors can be rewritten as a matrix multiplication, [itex]\vec x \cdot \vec y = \vec x^T \vec y[/itex], and in the book's derivation the transpose lands on the matrix A, not on the vector [itex]\vec y[/itex]. So there is no need to assume that [itex]\vec y[/itex] equals its own transpose; that assumption would be incorrect.
  • #1
rocomath
Suppose

[tex]A\overrightarrow{x}=\lambda_1\overrightarrow{x}[/tex]
[tex]A\overrightarrow{y}=\lambda_2\overrightarrow{y}[/tex]
[tex]A=A^T[/tex]

Take the dot product of the first equation with [tex]\overrightarrow{y}[/tex] and of the second with [tex]\overrightarrow{x}[/tex]

ME 1) [tex](A\overrightarrow{x})\cdot \overrightarrow{y}=(\lambda_1\overrightarrow{x})\cdot\overrightarrow{y}[/tex]

BOOK ... skipped steps but only shows this 1) [tex](\lambda_1\overrightarrow{x})^T\overrightarrow{y}=(A\overrightarrow{x})^T\overrightarrow{y}=\overrightarrow{x}^TA^T\overrightarrow{y}=\overrightarrow{x}^TA\overrightarrow{y}=\overrightarrow{x}^T\lambda_2\overrightarrow{y}[/tex]

Now it looks like I have to transpose my first step, but if I do so, do I assume that [tex]\overrightarrow{y}=\overrightarrow{y}^T[/tex]?
 
  • #2
Note that the dot product of two vectors is just a matrix multiplication. E.g., if we take [itex]\vec x, \vec y[/itex] to be (n x 1) column vectors, then the dot product is just
[tex]\vec x \cdot \vec y = \vec x^T \vec y[/tex]
where the right-hand side is the matrix multiplication of a (1 x n) matrix with an (n x 1) matrix. Then the solution easily follows. Try to write it out and indicate what is being done in each step.
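
Written out in components, the identification is explicit (a one-line check of this identity):

[tex]\vec x \cdot \vec y = \sum_{i=1}^{n} x_i y_i = \begin{pmatrix} x_1 & x_2 & \cdots & x_n \end{pmatrix}\begin{pmatrix} y_1 \\ y_2 \\ \vdots \\ y_n \end{pmatrix} = \vec x^T \vec y[/tex]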
 
  • #3
rocomath said:
Suppose

[tex]A\overrightarrow{x}=\lambda_1\overrightarrow{x}[/tex]
[tex]A\overrightarrow{y}=\lambda_2\overrightarrow{y}[/tex]
[tex]A=A^T[/tex]

Take the dot product of the first equation with [tex]\overrightarrow{y}[/tex] and of the second with [tex]\overrightarrow{x}[/tex]

ME 1) [tex](A\overrightarrow{x})\cdot \overrightarrow{y}=(\lambda_1\overrightarrow{x})\cdot\overrightarrow{y}[/tex]

BOOK ... skipped steps but only shows this 1) [tex](\lambda_1\overrightarrow{x})^T\overrightarrow{y}=(A\overrightarrow{x})^T\overrightarrow{y}=\overrightarrow{x}^TA^T\overrightarrow{y}=\overrightarrow{x}^TA\overrightarrow{y}=\overrightarrow{x}^T\lambda_2\overrightarrow{y}[/tex]

Now it looks like I have to transpose my first step, but if I do so, do I assume that [tex]\overrightarrow{y}=\overrightarrow{y}^T[/tex]?
You don't; they are not the same.

What you are doing is thinking of the dot product [itex]\vec{u}\cdot\vec{v}[/itex] as the matrix product [itex]\vec{u}^T\vec{v}[/itex], where [itex]\vec{u}[/itex] and [itex]\vec{v}[/itex] are "column" matrices and [itex]\vec{u}^T[/itex] is the "row" matrix corresponding to [itex]\vec{u}[/itex]. It then follows that [itex](A\vec{x})\cdot\vec{y}= (A\vec{x})^T\vec{y}= \vec{x}^TA^T\vec{y}[/itex], which, because [itex]A^T= A[/itex], is the same as [itex]\vec{x}^TA\vec{y}= \vec{x}\cdot (A\vec{y})[/itex]. It is not that [itex]\vec{y}^T= \vec{y}[/itex]; you never take the transpose of [itex]\vec{y}[/itex] at all.
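
Putting the chain together, the conclusion of the proof follows in one line:

[tex]\lambda_1(\vec x \cdot \vec y) = (A\vec x)\cdot \vec y = \vec x^T A^T \vec y = \vec x^T A \vec y = \vec x \cdot (A\vec y) = \lambda_2(\vec x \cdot \vec y)[/tex]

so [itex](\lambda_1 - \lambda_2)(\vec x \cdot \vec y) = 0[/itex], and if [itex]\lambda_1 \neq \lambda_2[/itex] then [itex]\vec x \cdot \vec y = 0[/itex]: the eigenvectors are orthogonal.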
 

Related to Orthogonal Eigenvector, Proof is bothering me

1. What is an orthogonal eigenvector?

In linear algebra, an eigenvector of a matrix is a nonzero vector whose direction is unchanged by the corresponding linear transformation: the matrix only scales it, by a factor called the eigenvalue. "Orthogonal eigenvectors" are eigenvectors that are perpendicular to one another, meaning their dot product is zero. For a symmetric matrix ([itex]A = A^T[/itex]), eigenvectors belonging to distinct eigenvalues are always orthogonal, which is exactly the fact proved in the thread above.
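
A concrete illustration, using a simple symmetric matrix chosen by hand:

[tex]A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix},\qquad A\begin{pmatrix} 1 \\ 1 \end{pmatrix} = 3\begin{pmatrix} 1 \\ 1 \end{pmatrix},\qquad A\begin{pmatrix} 1 \\ -1 \end{pmatrix} = 1\cdot\begin{pmatrix} 1 \\ -1 \end{pmatrix}[/tex]

The two eigenvectors have dot product [itex](1)(1) + (1)(-1) = 0[/itex], so they are orthogonal, as the distinct eigenvalues 3 and 1 guarantee.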

2. How is an orthogonal eigenvector determined?

For small matrices, the eigenvalues can be found by solving the characteristic equation [itex]\det(A - \lambda I) = 0[/itex] and the eigenvectors by solving the resulting linear systems; for larger matrices, numerical methods such as power iteration, the QR algorithm, or (for symmetric matrices) Jacobi iteration are used. Once the eigenvectors are found, orthogonality can be checked by taking the dot product of any two of them, which should equal zero (up to rounding error) if they are orthogonal.
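
As a numerical sketch of this check, assuming NumPy is available (np.linalg.eigh is NumPy's eigensolver for symmetric matrices; the matrix below is arbitrary example data):

[code]
import numpy as np

# An arbitrary symmetric example matrix
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# eigh is specialized for symmetric (Hermitian) matrices; it returns
# eigenvalues in ascending order and orthonormal eigenvectors as columns
eigenvalues, eigenvectors = np.linalg.eigh(A)

# Orthogonality check: the dot product of distinct eigenvectors should be ~0
for i in range(3):
    for j in range(i + 1, 3):
        dot = eigenvectors[:, i] @ eigenvectors[:, j]
        print(f"v{i} . v{j} = {dot:.2e}")
[/code]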

3. Why is the proof for orthogonal eigenvectors important?

The proof for orthogonal eigenvectors is important because it allows us to understand and make use of the properties and applications of these special vectors in linear algebra. The proof also provides a deeper understanding of the relationship between eigenvectors and eigenvalues, and how they can be used to simplify and solve complex problems in mathematics and science.

4. What are the key steps in proving that eigenvectors are orthogonal?

The key steps are: take the dot product of each eigenvalue equation with the other eigenvector; rewrite each dot product as a matrix product ([itex]\vec u \cdot \vec v = \vec u^T \vec v[/itex]); use the symmetry [itex]A = A^T[/itex] to move [itex]A[/itex] from one vector to the other, giving [itex]\lambda_1(\vec x \cdot \vec y) = \lambda_2(\vec x \cdot \vec y)[/itex]; and conclude that [itex](\lambda_1 - \lambda_2)(\vec x \cdot \vec y) = 0[/itex], so the dot product must vanish whenever the eigenvalues are distinct.

5. How is the proof for orthogonal eigenvectors applied in real-world scenarios?

The result, and the spectral theory behind it, has various real-world applications. For example, it is used in image processing and computer vision to decompose images into constituent parts for analysis. It is also used in physics and engineering, such as in quantum mechanics and structural analysis, to understand and manipulate complex systems. Furthermore, it is applied in data analysis and machine learning, where the orthogonal eigenvectors of a covariance matrix are used to reduce the dimensionality of large datasets and identify important features for classification and prediction tasks.
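
A minimal sketch of that last use case, assuming NumPy and synthetic data (this is PCA-style dimensionality reduction via an eigendecomposition of the covariance matrix, not any particular library's API):

[code]
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))      # 200 samples, 5 features (synthetic data)
Xc = X - X.mean(axis=0)            # center each feature

cov = Xc.T @ Xc / (len(Xc) - 1)    # symmetric covariance matrix
eigenvalues, eigenvectors = np.linalg.eigh(cov)

# eigh returns eigenvalues in ascending order, so the last two columns
# are the eigenvectors with the largest eigenvalues (principal directions)
top2 = eigenvectors[:, -2:]
X_reduced = Xc @ top2              # project the data onto a 2-D subspace

print(X_reduced.shape)             # (200, 2)
[/code]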
