Rank of a matrix

Main Question or Discussion Point

Hi! I want to ask you something. There is one part of my book which I can't understand.

My textbook says:
Because of symmetry, if we look at the transpose of the matrix [tex]A^t[/tex], we get that the row rang is less than or equal to the colonial rang.
I can't understand the bolded part. Please help! Thanks.
P.S. My teacher told me that if we start to prove this, we will get that the row rang is less than or equal to the colonial rang.
I think it is always equal; how is this possible?
 

Answers and Replies

OK, if I understood you correctly: the row rank (AKA the image, Im(A)) is the number of rows which remain after we do row reduction on the transposed matrix, and the rows which come out all zeros give Ker(A). There is a fundamental law which states dim(Im(A)) + dim(Ker(A)) = dim A, so as you can see from the equation, the dimension of Im(A) is always smaller than or equal to dim A.
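To see that equation in action, here is a quick sketch in Python with the sympy library (just an illustration on an arbitrary example matrix, assuming sympy is available):

[code]
from sympy import Matrix

# example matrix with one row of zeros
A = Matrix([[1, 0, 0],
            [0, 1, 0],
            [0, 0, 0]])

rank = A.rank()               # dim Im(A)
nullity = len(A.nullspace())  # dim Ker(A) = number of basis vectors of the kernel
print(rank, nullity, rank + nullity)  # prints 2 1 3; the sum equals the dimension of the domain
[/code]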
 
Independent vectors by matrix method

I can not understand how to prove that vectors are linearly independent by using the matrix method. Please, I want to know an easy way to make the matrix method easy to understand.
You can send me an e-mail at ahmedtomyusa@yahoo.com
 
transgalactic said:
OK, if I understood you correctly: the row rank (AKA the image, Im(A)) is the number of rows which remain after we do row reduction on the transposed matrix, and the rows which come out all zeros give Ker(A). There is a fundamental law which states dim(Im(A)) + dim(Ker(A)) = dim A, so as you can see from the equation, the dimension of Im(A) is always smaller than or equal to dim A.
Wait, can you explain further, maybe with an example? Thank you.
 
I can not understand how to prove that vectors are linearly independent by using the matrix method. Please, I want to know an easy way to make the matrix method easy to understand.
You can send me an e-mail at ahmedtomyusa@yahoo.com
Hint: Just like manipulating a system of linear equations, try using Gaussian elimination, and think about what happens to the matrix if the vectors are NOT linearly independent (what will you observe about the rows?).
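For example, one way to check this by computer (a rough sketch in Python with sympy, purely as an illustration on made-up vectors) is to stack the vectors as rows and compare the rank with the number of vectors:

[code]
from sympy import Matrix

# stack the vectors as the rows of a matrix
vectors = [[1, 2, 3],
           [4, 5, 6],
           [5, 7, 9]]   # third row = first + second, so they are dependent

A = Matrix(vectors)
print(A.rref()[0])               # the reduced form has a row of all zeros
print(A.rank() == len(vectors))  # False: full row rank <=> linearly independent
[/code]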
 
I'll try to explain that in more detail.

For example, you are given the matrix
1 0 0
0 1 0
0 0 1
Its dim Im = 3 (aka row rank = 3) and its dim Ker = 0, because all of the rows are independent, so in this case the row rank equals the total dimension.
And if we have
1 0 0
0 1 0
0 0 0
then dim Im = 2 and dim Ker = 1, so dim Im (aka the row rank) is smaller than the total dim V. It cannot be larger than the total dim V.

I am not sure about the term "row rank"; I think it's dim Im, but I am not so familiar with these terms.
HallsofIvy, can you correct me if I got this row rank terminology right?
 
Well, all you have to remember from the rank-nullity theorem is that when you add rank(A) + null(A) you get the dimension of A.

In order to compute the row rank, all you have to do is use Gaussian elimination and locate the rows that have pivots. If you have any linearly dependent vectors, then those will in turn give you rows of zeros, and those count toward the null space. Therefore, your rank is essentially just the number of rows with actual pivots.

What does this mean? Remember that your range is basically the set of vectors that the map actually reaches. These are just the images of the stuff being mapped from the vector space. If it's an image and it's not zero, then it's in the range (zero is trivial). However, the dimension of the space doesn't have to equal the dimension of the range. There can be stuff in the first vector space that maps only to zero, and that is basically the null space. That's why the dimension of the image is either equal to or LESS than the dimension of the vector space. If something is not becoming anything on the other side of the linear map, then it must be sitting in the null space.

You see that in matrix form. You can have a matrix that, when reduced, has pivots. If every row has its own pivot, then (since the last variable has a readily available solution) we can just back-substitute and get a solution for all the variables. Therefore the pivots signify which variables get solutions. This need not apply to all matrices. You may have a matrix that, when reduced, yields a row of all zeros. Then you will have fewer pivots than the dimension (aka rank(A) < dim(A)).
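As a rough illustration in Python with sympy (just a sketch on an arbitrary example), rref() returns both the reduced matrix and the pivot columns, so the rank is simply the number of pivots:

[code]
from sympy import Matrix

A = Matrix([[1, 2, 1],
            [2, 4, 2],    # twice the first row -> linearly dependent
            [0, 1, 3]])

reduced, pivots = A.rref()  # reduced row echelon form and pivot column indices
print(reduced)              # one row of all zeros appears
print(pivots)               # (0, 1) -> two pivots
print(A.rank())             # 2, i.e. rank(A) < dim(A) = 3
[/code]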
 
Why, on the first one, is dim Ker = 0? How did you find it?
 
HallsofIvy
Are you referring to
transgalactic said:
For example, you are given the matrix
1 0 0
0 1 0
0 0 1
Its dim Im = 3 (aka row rank = 3) and its dim Ker = 0
In this case, it should be obvious. That is the identity matrix, which maps every vector v to itself. The only vector that is mapped to the 0 vector is the 0 vector itself, and the set {0} is a "subspace" of dimension 0.
More generally, transgalactic is saying that you can determine the nullity (the dimension of the kernel) of a matrix by row reducing it. If it reduces to something that has only non-zero entries on the diagonal, then it is invertible and "one-to-one". It cannot map a non-zero vector to 0, so its nullity is 0. In general, the nullity is the number of "all 0" rows after row reducing.
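A quick way to check that on a computer (a rough sketch in Python with sympy, just as an illustration): an invertible matrix row-reduces to the identity and its null space is empty.

[code]
from sympy import Matrix, eye

A = Matrix([[2, 1, 0],
            [0, 1, 3],
            [0, 0, 4]])   # non-zero entries all the way down the diagonal

print(A.rref()[0] == eye(3))  # True: it row-reduces to the identity, so A is invertible
print(A.nullspace())          # []  -> only the zero vector maps to 0, so the nullity is 0
[/code]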

(By the way, it is "rank", not "rang" and "column" not "colonial"!)
 
:D :D :D Those mistranslations are very funny, aren't they? OK, what is the mapping here:

[tex]\mathbb{R}^3 \rightarrow \mathbb{R}[/tex], or what?
 
When you have R^3 >> R, then your matrix is 1x3; it's the opposite of the order in the arrow: the left side (R^3) gives the number of columns and the right side (R) gives the number of rows of your matrix.

Regarding the question of Physicsissuef:

Generally, the number of all-zero rows represents dim(Ker A), but the more general way to find it is by solving
A*(x, y, z) = (0, 0, 0). Basically, what that says is: just set the equations of A equal to zero and find the values of the parameters x, y and z.

We are not interested in the actual numbers, but in the number of independent solutions we get, because that represents dim(Ker A).
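Here is a rough sketch of that computation in Python with sympy (just an illustration; the 1x3 matrix is an arbitrary example): nullspace() solves A*(x, y, z) = (0, 0, 0) and returns a basis of the solution set, so its length is dim(Ker A).

[code]
from sympy import Matrix

# a map from R^3 to R is a 1x3 matrix
A = Matrix([[1, 2, 3]])

kernel_basis = A.nullspace()  # basis of the solutions of A*(x, y, z) = (0, 0, 0)
print(kernel_basis)           # two independent solution vectors
print(len(kernel_basis))      # 2 = dim(Ker A); the rank is 1, and 1 + 2 = 3
[/code]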
 
Can we say it with simpler words: if we have 3 vectors, we suppose that they are all independent and have dimension 3, but later we find out that actually only 2 of them are independent. So when we first look at the matrix, we say that the dimension of the rows is less than or equal to the rank, right?
 
I agree with you.
 
Can we say it with simpler words: if we have 3 vectors, we suppose that they are all independent and have dimension 3, but later we find out that actually only 2 of them are independent. So when we first look at the matrix, we say that the dimension of the rows is less than or equal to the rank, right?
The row rank is the number of linearly independent row vectors. If you have 3 vectors that are all independent, and the dimension of your space is 3, then the rank equals the dimension of the space. Your rank is lower than the dimension of the space if your row vectors are not ALL linearly independent. Again, you will see this through row reduction: you will end up with a rank lower than the dimension of the space if one or more of the vectors you have in row form are linearly dependent on the others. Those rows will turn into rows of all zeros.
 
And is the dimension of the space Im 3 + Ker 0 = 3? I still cannot understand why Ker = 0.
Do we have R^3 --> R^3?
 
R^3 >> R^3 means that you have a 3x3 matrix.

Ker A = 0 in the case where, after performing a row reduction on your matrix, you don't have any all-zero rows.
 
And when is the kernel 1, or 2, or 3?
 
When the kernel is one, the rank is 2:
dim Im A + dim Ker A = dim A
 
When the kernel is one, the rank is 2:
dim Im A + dim Ker A = dim A
How will I know it? Can you give me some example, please, and tell me how to calculate it?
 
And is the dimension of the space Im 3 + Ker 0 = 3? I still cannot understand why Ker = 0.
Do we have R^3 --> R^3?
To calculate, take the matrix and perform Gaussian elimination to row reduce it.

Here's an example of a matrix with linearly independent rows. (It's already row-reduced to the simplest form, but you can have a matrix that isn't this simple... which is why you should reduce it yourself first)

1 0 0
0 1 0
0 0 1

The dimension here is 3, and the rank is also 3 because you have 3 linearly independent rows. Thus, using the theorem rank(A) + null(A) = dim(A), you get 3 + 0 = 3, which checks out.

However, you may be wondering, why is nullA = 0?

Well, let's look at a separate situation. Let the matrix be one that has a row which is linearly dependent on the others.

1 0 0
0 1 0
1 0 0

This will reduce to:

1 0 0
0 1 0
0 0 0

Therefore, count the pivots (if you don't know what pivots are, then re-read about row reduction or Gaussian elimination).

You have two pivots here, signifying that you have TWO linearly independent rows. Therefore the rank here is 2. However, you're still dealing with R^3; the only difference is that one of the rows is now all zeros. Thus, if you know the rank is 2 and the dimension of the space is 3, what do you think that tells you about the null space dimension (you can observe the matrix, or just use the equation)?
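If you want to check this example by computer, here's a rough sketch with Python's sympy (purely an illustration of the same matrices):

[code]
from sympy import Matrix

A = Matrix([[1, 0, 0],
            [0, 1, 0],
            [1, 0, 0]])   # the third row repeats the first row

reduced, pivots = A.rref()
print(reduced)             # 1 0 0 / 0 1 0 / 0 0 0
print(len(pivots))         # 2 pivots -> rank 2
print(len(A.nullspace()))  # 1 -> rank 2 + nullity 1 = 3
[/code]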
 
BryanP said:
Thus, if you know the rank is 2 and the dimension of the space is 3, what do you think that tells you about the null space dimension (you can observe the matrix, or just use the equation)?
I don't know what you mean. Can you please explain?
 
I don't know what you mean. Can you please explain?
In the example where we have

1 0 0
0 1 0
0 0 0

We have a rank of two because we find two of the rows are linearly independent. The third row isn't because it consists of all zeroes.

We're dealing with 3 dimensions here so by the rank-nullity theorem:

rank(A) + null(A) = dim(A)

the rank is 2 and we know the dimension is 3, so:

2 + null(A) = 3
null(A) = 1

Or you can see that in the matrix since you have a row of zeros...
 
So Ker(A) is the number of "zero" rows, right?
 
Yes.

Do you know how to perform a row reduction?
 
HallsofIvy
A point about terminology: No, the "kernel" is not "the number of zero rows", nor is it "1 or 2". The dimension of the kernel (also called the nullity of the linear transformation) is "the number of zero rows" in the row reduction of the matrix. The kernel itself is a subspace of the domain space of the linear transformation.
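In sympy terms (a rough sketch, just to illustrate the distinction on an example matrix): nullspace() returns a basis of the kernel itself, a set of vectors in the domain, while the nullity is just how many of them there are.

[code]
from sympy import Matrix

A = Matrix([[1, 0, 0],
            [0, 1, 0],
            [0, 0, 0]])

kernel_basis = A.nullspace()  # the kernel is the subspace spanned by these vectors
print(kernel_basis)           # [Matrix([[0], [0], [1]])]: a basis of a subspace of R^3, not a number
print(len(kernel_basis))      # 1: this number is the nullity, dim(Ker A)
[/code]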
 
