MHB Which of the Following is Incorrect Regarding Matrices and Vectors?

Summary
The discussion revolves around identifying the incorrect statement regarding matrices and vectors. The statement that "If A is a square matrix for which A² - A = 0, then A = 0 or A = I" is highlighted as incorrect, since matrices other than 0 and I can satisfy this equation. The claim that diagonal matrices commute (statement b) is questioned but holds in general, not just in examples. The diagonalizability of a 4x4 matrix with distinct eigenvalues (statement c) is affirmed, and the dimension of the space of polynomials of degree at most 3 (statement d) is confirmed to be 4. The discussion also touches on linear dependence of two vectors (statement e) and the fact that a product of non-zero matrices can be zero.
Yankel
One last question on these topics: I need to choose the WRONG statement, and they all seem correct to me...

a) If A is a square matrix for which
\[A^{2}-A=0\]

then A=0 or A=I

b) If A and B are diagonal matrices, then AB=BA

c) A 4x4 matrix with eigenvalues 1, 0, -1, 2 is diagonalizable

d) The dimension of the space of polynomials of degree at most 3 (ax^3+bx^2+...) is 4

e) If two vectors are linearly dependent, then one is necessarily a scalar multiple of the other

'a' is correct
'b', not sure; I tried one example and it worked (a general check is sketched below)
'c' Each eigenvalue appears once, so it's not possible to have an eigenvalue that appears twice but has only one corresponding eigenvector (for example)
'd' Isn't it 4?
'e' I think so...
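Actually, for (b) one can check in general rather than by example: writing $A=\mathrm{diag}(a_1,\dots,a_n)$ and $B=\mathrm{diag}(b_1,\dots,b_n)$,

$$AB=\mathrm{diag}(a_1b_1,\dots,a_nb_n)=\mathrm{diag}(b_1a_1,\dots,b_na_n)=BA,$$

since the scalar entries commute. So (b) holds for all diagonal matrices of the same size, not just the example I tried.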

I will appreciate your help.
 
Concerning answer a): consider
$$A=\begin{bmatrix} 1 &1 \\ 0 &0 \end{bmatrix}.$$
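Checking directly that this falsifies a):

$$A^2=\begin{bmatrix} 1 &1 \\ 0 &0 \end{bmatrix}\begin{bmatrix} 1 &1 \\ 0 &0 \end{bmatrix}=\begin{bmatrix} 1 &1 \\ 0 &0 \end{bmatrix}=A,$$

so $A^2-A=0$ even though $A\neq 0$ and $A\neq I$. That makes a) the incorrect statement.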
 
To underscore a common mistake:

Just because

$A^2 - A = A(A - I) = 0,$

there is NO REASON to believe $A = 0$ or $A - I = 0$.

It is VERY POSSIBLE to have matrices $A,B$ with $AB = 0$ but $A,B \neq 0$.

Any such matrix, of course, is singular.
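A concrete pair, for illustration:

$$A=\begin{bmatrix} 1 &0 \\ 0 &0 \end{bmatrix},\qquad B=\begin{bmatrix} 0 &0 \\ 0 &1 \end{bmatrix},\qquad AB=\begin{bmatrix} 0 &0 \\ 0 &0 \end{bmatrix},$$

with $A\neq 0$ and $B\neq 0$, and both $A$ and $B$ singular.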
 