Some questions about reducible matrices and operators

In summary, the conversation discusses the equality of nullity between A and PAP⁻¹ and the confusion around the size and dimension of the vector spaces involved. The main point of confusion is how to define the same linear transformation on two vector spaces of different dimensions, and how to prove that only the zeros of W go to zero in R2. The conversation also touches on the concept of invertible matrices and the transformation of linear operators from one basis to another.
  • #1
Hello,

This is regarding the equality of nullity between A and PAP⁻¹.

If my understanding is correct, then the situation should be as in the diagram below:


V ----------------> R1 (isomorphic to V)
|                   |
|                   |
v                   v
W ----------------> R2 (isomorphic to W)


So A: V -> W
   P: V -> R1
   P: W -> R2

I am not convinced about this second map, since the dimension of W is less than that of V, because the nullity in this case is assumed to be greater than 1.

If my understanding is correct, what one does is the following: since R1 is isomorphic to V,

R1 -> R2 is the composite R1 -> V -> W -> R2. That is, take a basis in R1 and apply P⁻¹ (which is legitimate, since the inverse is defined for an isomorphism), and then, applying A, you land the vectors of V that you obtained from R1 in W, some of them at 0. Then you apply P to all of those vectors and claim that none of them go to 0 in R2 other than the zeros of W. What is confusing is how you define the same linear transformation on two vector spaces of different dimensions in general, and, if possible, how you prove that only the zeros of W go to zeros in R2 and nothing else.
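
For concreteness, here is a minimal numerical sketch (Python/NumPy, with matrices made up just for illustration) of that composite, under the assumption that V and W have the same dimension n, so that a single invertible n x n matrix P can serve as both vertical maps:

```python
import numpy as np

# Minimal sketch of the composite R1 -> V -> W -> R2, assuming dim V = dim W = n
# so that a single invertible n x n matrix P serves as both vertical isomorphisms.
rng = np.random.default_rng(0)
n = 4

A = rng.standard_normal((n, n))   # A : V -> W
A[:, -1] = 0                      # force a nontrivial null space (nullity >= 1)
P = rng.standard_normal((n, n))   # generic square matrix, invertible with probability 1

def composite(x):
    """Coordinates in R1 -> pull back by P^{-1} -> apply A -> push forward by P."""
    return P @ (A @ np.linalg.solve(P, x))

B = P @ A @ np.linalg.inv(P)      # the similar matrix P A P^{-1}

x = rng.standard_normal(n)
print(np.allclose(composite(x), B @ x))   # True: the composite R1 -> R2 is P A P^{-1}
```

In coordinates, the map R1 -> R2 is therefore exactly the similar matrix PAP⁻¹.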
 
  • #2
How can P be both a map from V to R_1, and W to R_2? It can't and it isn't.

If P is invertible, then it is square as well. So even if we assume A is a map from V to V, that is, that V is isomorphic to W, it is not correct to assert "W has smaller size [dimension?] than V".


By equality of nullity, do you mean the dimension of the kernel/null space? That is simple: show that the two null spaces are isomorphic vector spaces (P is invertible).
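
Here is a minimal numerical illustration of that argument (a sketch only, using NumPy; the specific matrices are just examples): if Av = 0 then (PAP⁻¹)(Pv) = PAv = 0, so P carries the null space of A bijectively onto the null space of PAP⁻¹, and the nullities agree.

```python
import numpy as np

# Sketch: nullity(A) = nullity(P A P^{-1}) because P maps ker(A) isomorphically
# onto ker(P A P^{-1}).  Check numerically via rank-nullity: nullity = n - rank.
rng = np.random.default_rng(1)
n = 5

A = rng.standard_normal((n, n))
A[:, :2] = 0                        # make the first two columns zero, so nullity(A) = 2
P = rng.standard_normal((n, n))     # invertible with probability 1
B = P @ A @ np.linalg.inv(P)

nullity = lambda M: M.shape[1] - np.linalg.matrix_rank(M)
print(nullity(A), nullity(B))       # 2 2

# Kernel correspondence: e1 is in ker(A) (its column of A is zero), so P e1 is in ker(B).
e1 = np.zeros(n); e1[0] = 1.0
print(np.allclose(A @ e1, 0), np.allclose(B @ (P @ e1), 0))   # True True
```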
 
  • #3
Actually, after I wrote down the query, I happened to refer again to Hoffman and Kunze and found that this is a standard theorem regarding the transformation of a linear operator from one basis to another.

Then I realized that the unclear point was this: if Tv is a vector expressed in the basis B, how can I write P(Tv) for its expression with respect to B', where P is the matrix of transformation from B to B'? What is unclear is that in doing this you are actually premultiplying a vector which is already in the space W, whereas Theorem 8 on p. 53 of the book says:

Suppose P is an n x n invertible matrix over F. Let V be an n-dimensional vector space over F and let B be an ordered basis of V. Then there exists a unique ordered basis B' of V such that [alpha]_B = P[alpha]_B' for every alpha in V.

So how is it that here the vector space in which the vector resides and the basis are completely different? Am I missing something very obvious? My thinking is that, even if the space W has smaller dimension than V, it is perhaps extended by adding 0s to match V, and then the above technique is applied. Still, I am highly confused about applying an n x n matrix, which transforms V -> V, to something in W.
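
As a concrete illustration of what the theorem asserts (a small sketch in R^2; the basis vectors b1', b2' are made up just for this example), P relates the two coordinate columns [alpha]_B and [alpha]_B', both of which live in F^n:

```python
import numpy as np

# Small sketch of Theorem 8 in R^2.  B is the standard basis; B' = {b1', b2'} is a
# second basis chosen here purely for illustration.  P has the B' vectors, written
# in B-coordinates, as its columns, and it converts B'-coordinates to B-coordinates.
b1p = np.array([1.0, 1.0])
b2p = np.array([1.0, -1.0])
P = np.column_stack([b1p, b2p])

alpha_Bp = np.array([2.0, 3.0])        # [alpha]_B' : alpha = 2*b1' + 3*b2'
alpha_B = P @ alpha_Bp                 # theorem: [alpha]_B = P [alpha]_B'

print(np.allclose(alpha_B, 2 * b1p + 3 * b2p))   # True
```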

Sorry, I do not know how to use subscripts here for clarity, but hopefully I have been able to get my doubt across.
 