MHB Which one of these two statements is wrong?

Summary
The discussion revolves around two statements regarding linear algebra: the first asserts that every set of n linearly independent vectors forms a basis for a vector space of dimension n, while the second claims that an invertible matrix is necessarily diagonalizable. The first statement is debated because, while n linearly independent vectors can span a vector space of dimension n, the requirement for spanning is emphasized. The second statement is challenged with a counterexample of a non-diagonalizable invertible matrix, specifically D = [[1, 1], [0, 1]], which has a non-zero determinant but is not diagonalizable. The conversation highlights the need for clarity on the relationship between linear independence, spanning, and the properties of matrices in linear algebra.
Yankel
a. Every set of n linearly independent vectors is a basis of a vector space of dimension n.

b. An invertible matrix is necessarily diagonalizable.

They both seem wrong to me, but only one is supposed to be.

'a' sounds wrong because the n vectors must also span the vector space,

and 'b' because I don't see the relation between invertible and diagonalizable.
 
Yankel said:
'a' sounds wrong because the n vectors must also span the vector space
Which vector space?
 
Yankel said:
a. Every set of n linearly independent vectors is a basis of a vector space of dimension n.

b. An invertible matrix is necessarily diagonalizable.

'a' sounds wrong because the n vectors must also span the vector space, and 'b' because I don't see the relation between invertible and diagonalizable.

n linearly independent vectors in a vector space of dimension n necessarily span it: if they did not, you could pick a vector outside their span and obtain n + 1 linearly independent vectors, contradicting the fact that the dimension is n.

For 'b' you should try to find a counterexample.
Can you think of a matrix that is not diagonalizable, but still has a non-zero determinant?
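To make the argument for 'a' concrete, here is a minimal sketch (my own illustration, with vectors I chose myself, not taken from the thread): in $\mathbb{R}^3$, three vectors are linearly independent exactly when the matrix having them as columns has a non-zero determinant, and that same condition guarantees they span $\mathbb{R}^3$, so they form a basis.

```python
def det3(m):
    """Determinant of a 3x3 matrix by cofactor expansion along the first row."""
    a, b, c = m[0]
    d, e, f = m[1]
    g, h, i = m[2]
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

# Three linearly independent vectors (chosen for illustration), as columns:
v1, v2, v3 = (1, 0, 1), (0, 1, 1), (1, 1, 0)
M = [[v1[0], v2[0], v3[0]],
     [v1[1], v2[1], v3[1]],
     [v1[2], v2[2], v3[2]]]

# Non-zero determinant: the columns are independent AND span R^3,
# i.e. {v1, v2, v3} is a basis.
print(det3(M))  # -2
```

The point is that, for n vectors in an n-dimensional space, "independent" and "spanning" are decided by the same single condition, so one implies the other.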
 
To extend Serena's statement:

If D is a diagonalizable matrix, then there exists an invertible n×n matrix P such that D = $P^{-1}AP$, where A is the diagonal matrix with the eigenvalues of D on its diagonal.

Now take D = $\begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}$, whose eigenvalues are 1, 1. If D were diagonalizable, there would exist an n×n invertible matrix P such that

D = $P^{-1}\begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}P$, and $\begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}$ is just the identity matrix,

so D = $P^{-1}IP = P^{-1}P = I$. Since $D \neq I$, it is clear that D is not diagonalizable.

Yet D is invertible: Det(D) = 1.
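The same counterexample can be checked numerically. A minimal sketch (my own check, not part of the original posts): D = [[1, 1], [0, 1]] has determinant 1, so it is invertible, but its only eigenvalue 1 has a one-dimensional eigenspace, so D cannot supply two independent eigenvectors and is not diagonalizable.

```python
D = [[1, 1],
     [0, 1]]

# Invertibility: 2x2 determinant ad - bc.
det = D[0][0] * D[1][1] - D[0][1] * D[1][0]
print(det)  # 1, non-zero, so D is invertible

# Eigenvectors for eigenvalue 1 solve (D - I)v = 0, i.e.
# [[0, 1], [0, 0]] v = 0, which forces v = (t, 0): a single line,
# so the eigenspace has dimension 1 < 2.
def is_eigenvector(v):
    """Check whether D v = 1 * v, i.e. v is an eigenvector for eigenvalue 1."""
    x, y = v
    return (D[0][0] * x + D[0][1] * y, D[1][0] * x + D[1][1] * y) == (x, y)

print(is_eigenvector((1, 0)))  # True
print(is_eigenvector((0, 1)))  # False: D maps (0, 1) to (1, 1)
```

With only one independent eigenvector, no invertible P of eigenvectors exists, which is exactly why statement 'b' fails.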
 
Thanks everyone for your answers, it's very helpful.

I still don't fully understand why, when the dimension is n, the n vectors must also span.

I will try to look for some further material on it.

Thanks
 
