How can I convince myself that I can find the inverse of this matrix?

  • #31
Mark44 said:
The first is a row vector. The second is a column vector, which is the transpose of the row vector. IOW, the column vector is ##x^T##, using the usual notation.
swampwiz said:
I thought it was the other way around.
Not sure what you're considering to be the other way around. My reply to @Hall, which is quoted above, was a response to his asking what is the difference between ##[x_1, x_2, \dots, x_n]## and ##\begin{bmatrix}x_1 \\ x_2 \\ \vdots \\ x_n \end{bmatrix}##.
Rows are horizontal and columns (like the columns of a building) are vertical, so the first vector above is a row vector, and the second vector is a column vector.

If your confusion is with the notation ##x^T##, a transpose can be either a row vector or a column vector, depending on how ##x## is originally defined.
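To make the row/column distinction concrete, here is a small sketch using NumPy (assuming it is available); the array contents are arbitrary illustration values:

```python
import numpy as np

# A 1 x n row vector, written horizontally ...
row = np.array([[1.0, 2.0, 3.0]])   # shape (1, 3)

# ... and its transpose, an n x 1 column vector, written vertically.
col = row.T                          # shape (3, 1)

print(row.shape)  # (1, 3)
print(col.shape)  # (3, 1)

# The transpose goes both ways: if x is defined as a column,
# then x.T is the corresponding row, and (x.T).T recovers x.
assert np.array_equal(col.T, row)
```

This matches the point above: whether ##x^T## is a row or a column depends entirely on how ##x## was defined in the first place.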
 
  • #32
Mark44 said:
Not sure what you're considering to be the other way around. My reply to @Hall, which is quoted above, was a response to his asking what is the difference between ##[x_1, x_2, \dots, x_n]## and ##\begin{bmatrix}x_1 \\ x_2 \\ \vdots \\ x_n \end{bmatrix}##.
Rows are horizontal and columns (like the columns of a building) are vertical, so the first vector above is a row vector, and the second vector is a column vector.

If your confusion is with the notation ##x^T##, a transpose can be either a row vector or a column vector, depending on how ##x## is originally defined.
What I was saying was that I thought the nominal form of a vector is a column, not a row. Certainly if written as ##Ax##, ##x## is a column vector.
 
  • #33
swampwiz said:
What I was saying was that I thought the nominal form of a vector is a column, not a row. Certainly if written as ##Ax##, ##x## is a column vector.
I don't think there is a nominal form of a vector. However, in the context of the expression ##Ax##, with ##A## being a matrix, ##x## would have to be a column vector.
 
  • #34
More generally, given an ##n \times n## matrix, the vector must be ##n \times 1## for the product to be defined. So, yes, a column vector.
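As a sanity check, here is a short NumPy sketch (assuming NumPy is available; the matrix entries are arbitrary) showing that the product is defined for an ##n \times 1## column on the right, but not for a ##1 \times n## row:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])   # 2 x 2 matrix

x = np.array([[5.0],
              [6.0]])        # 2 x 1 column vector

y = A @ x                    # (2 x 2)(2 x 1) -> 2 x 1: defined
print(y.shape)               # (2, 1)

# Putting the 1 x 2 row vector x.T on the right is a shape
# mismatch, so NumPy rejects the product.
try:
    A @ x.T
except ValueError as err:
    print("row vector on the right fails:", err)
```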
 
  • #35
swampwiz said:
What about the eigenproblem equation? A x = 0, but the eigenvectors are solutions of x that are NOT 0; of course, this is possible because the determinant of A must be 0 for this to work.
The eigenproblem equation is ##Ax = \lambda x##, which leads to ##(A - \lambda I)x = 0##, and this has nonzero solutions only when ##\det(A - \lambda I) = 0##. The eigenvectors for a given ##\lambda## are the nonzero vectors in the nullspace of ##A - \lambda I##.
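Both conditions can be checked numerically. This sketch (assuming NumPy; the diagonal matrix is just a convenient example) verifies that each eigenpair satisfies ##Ax = \lambda x## with ##x \ne 0##, and that ##\det(A - \lambda I)## vanishes:

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 3.0]])

# np.linalg.eig returns the eigenvalues and the (unit) eigenvectors,
# one eigenvector per column of eigvecs.
eigvals, eigvecs = np.linalg.eig(A)

for lam, v in zip(eigvals, eigvecs.T):
    v = v.reshape(-1, 1)                # treat as an n x 1 column
    # A x = lambda x, with x nonzero ...
    assert np.allclose(A @ v, lam * v)
    # ... which is only possible because det(A - lambda I) = 0.
    assert abs(np.linalg.det(A - lam * np.eye(2))) < 1e-9

print(sorted(eigvals))                  # [2.0, 3.0]
```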
 
