What is the purpose of introducing the transpose in this proof?

  • Thread starter: georg gill
  • Tags: proof, rank
georg gill
http://www.viewdocsonline.com/document/8uu6tm

You can zoom in on the proof using the tabs at the lower left.

I have understood all of the proof until the last part, where they introduce the transpose of A after having proved that the row rank of A is less than or equal to the column rank of A. Why do they use the transpose?
 
georg gill said:
I have understood all of the proof until the last part where they introduce the transpose of A after they have proved that the row rank of A is less than or equal to the column rank of A. Why do they use the transpose?

Hey georg gill.

The transpose turns the rows of a matrix into its columns: row 1 becomes column 1, row 2 becomes column 2, and so on.

This is probably why they prove the inequality for the rows first: once it is proved for the rows, applying the same inequality to the transpose gives it for the columns, and the two inequalities together force equality.
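As a quick numerical sanity check of this (the matrix below is an arbitrary example, not from the thread's linked proof), numpy's `matrix_rank` computes the column rank, so applying it to A and to A.T compares column rank to row rank:

```python
import numpy as np

# A deliberately rank-deficient 3x4 matrix (row 3 = row 1 + row 2).
A = np.array([[1., 2., 0., 1.],
              [0., 1., 1., 3.],
              [1., 3., 1., 4.]])

# matrix_rank gives the dimension of the column space.
# The rows of A are the columns of A.T, so matrix_rank(A.T) is the row rank of A.
col_rank = np.linalg.matrix_rank(A)
row_rank = np.linalg.matrix_rank(A.T)

print(col_rank, row_rank)  # both are 2
```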
 
f:V-->W, assume f is onto. then consider, f*:W*-->V*. claim it is injective.

(recall V* = linear maps V-->R.)

this injectivity is trivial, since distinct functions on W are distinguished at some pair of points of W,

and these are images of some pair of distinct points of V by surjectivity of f.

hence the rank of f equals the rank of f* (both equal dim W, since f is onto and f* is injective). QED,

i.e. row rank = column rank.

If you don't understand this argument, you will benefit from spending some time understanding what V* means and why the transpose of the matrix of V-->W is the matrix of the map W*-->V*.
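That last claim, that the transpose of the matrix of V-->W is the matrix of W*-->V*, can be illustrated concretely. In this hypothetical sketch (the specific matrix and vectors are my own example), a functional φ on W = R^2 is identified with its coefficient row vector c, and the pullback f*(φ) = φ∘f turns out to have coefficient vector A.T @ c:

```python
import numpy as np

# f: R^3 -> R^2 with matrix A, so f(v) = A @ v.
A = np.array([[1., 2., 0.],
              [0., 1., 1.]])

# A functional phi in W* = (R^2)* is a row vector c: phi(w) = c @ w.
c = np.array([3., -1.])

# f*(phi) = phi o f acts on v in R^3 as c @ (A @ v) = (A.T @ c) @ v,
# so the coefficient vector of f*(phi) is A.T @ c: the matrix of f* is A.T.
v = np.array([1., 4., -2.])
lhs = c @ (A @ v)      # phi(f(v))
rhs = (A.T @ c) @ v    # (f* phi)(v) computed via the transpose

print(np.isclose(lhs, rhs))  # True
```

Since v was arbitrary, the same identity holds for every v, which is exactly the statement that A.T represents f* in the dual bases.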
 
mathwonk said:
f:V-->W, assume f is onto. then consider, f*:W*-->V*. claim it is injective.

(recall V* = linear maps V-->R.)

this injectivity is trivial, since distinct functions on W are distinguished at some pair of points of W,

and these are images of some pair of distinct points of V by surjectivity of f.

hence the rank of f equals the rank of f*. QED,

i.e. row rank = column rank.

i think it might be helpful to actually say what f* is, as well.

i assume you mean f*(φ) = φ○f.

for functions, in general (and thus linear transformations, in particular):

φ1○f = φ2○f → φ1 = φ2

when f is surjective (onto), if one allows that the axiom of choice is true

(i assume you chose to prove the contrapositive to avoid this subtlety,

but i believe that "exhibiting the w" for which this is true (for two arbitrary

distinct functions) amounts to the same thing).

i am not convinced that 2 distinct functions must disagree at two points,

it seems to me they might just disagree at only one, and really, one

point (of V) of disagreement is all you need to show that φ1○f, φ2○f are distinct.
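the cancellation property above (φ1○f = φ2○f → φ1 = φ2 when f is onto) says exactly that f* is injective. for matrices, "f onto" means A has full row rank and "f* injective" means A.T has full column rank, which is easy to check numerically (the example matrix here is my own, not from the thread):

```python
import numpy as np

# A surjective map f: R^3 -> R^2 (A has full row rank 2).
A = np.array([[1., 0., 2.],
              [0., 1., 1.]])

# f surjective  <->  A has rank equal to its number of rows.
# f* injective  <->  A.T has rank equal to its number of columns
#                    (trivial null space for A.T).
rank_A  = np.linalg.matrix_rank(A)    # 2: f is onto R^2
rank_At = np.linalg.matrix_rank(A.T)  # 2 = columns of A.T, so f* is injective

print(rank_A, rank_At)
```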
 
you are right. then the argument works even easier. good for you. i was probably thinking visually and trying to translate into words. probably the two "points" i visualized were the two distinct functions, as points in function space. or maybe i was just confused.

anyway, giving the argument correctly as you did, it follows that, at least for a surjective map V-->W, the column rank, i.e. the dimension of the image, equals the dimension of the image of the induced map (composition, as you said) W*-->V*. this is the column rank of the transpose of the original matrix, which equals the row rank of the original matrix.

now to see you can reduce to the surjective case: if I is the image of the map V-->W then the map factors as V-->I-->W, so that the dual map also factors as W*-->I*-->V*.

Since I is the image of V-->W, the map V-->I has the same rank as V-->W, and since V-->I is surjective it also has the same rank as I*-->V*, by the special case we just did.

Since every linear map defined on I extends to W (just enlarge a basis for I to a basis for W), the restriction map W*-->I* is surjective. Then the rank of the composition W*-->V* equals the rank of the injective map I*-->V*, i.e. it equals dim I = rank V-->W.

Altogether, column rank = rank V-->W = rank W*-->V* = row rank.
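this reduction to the surjective case is, in matrix terms, a rank factorization A = BC: C is the surjective map V-->I onto coordinates in the image, and B is the injective inclusion I-->W. a numerical sketch with a hypothetical matrix (my own example, chosen so the factorization is easy to read off):

```python
import numpy as np

A = np.array([[1., 2., 0., 1.],
              [0., 1., 1., 3.],
              [1., 3., 1., 4.]])  # rank 2; row 3 = row 1 + row 2

# Factor A = B @ C through its image I (2-dimensional):
# C: R^4 -> R^2 is surjective (coordinates of each column of A in a basis of I),
# B: R^2 -> R^3 is injective (its columns are that basis of the column space).
B = np.array([[1., 0.],
              [0., 1.],
              [1., 1.]])
C = np.array([[1., 2., 0., 1.],
              [0., 1., 1., 3.]])

# The chain of equalities in the argument, numerically:
# rank A = rank C = rank C.T = rank A.T.
ranks = [np.linalg.matrix_rank(M) for M in (A, C, C.T, A.T)]
print(np.allclose(A, B @ C), ranks)  # True [2, 2, 2, 2]
```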
 
It seems to me these arguments are only two steps away from general abstract nonsense. Even though that makes them very fine proofs that also apply in more general settings, the OP was about a linear map given by a matrix. Since linearity lets you use matrix operations directly, the more abstract proof might just obscure the facts.
 