Prove the existence of row-reduced matrices with restrictions

  • Thread starter rockerman
  • #1
rockerman
Let A = [a b; c d] be a 2x2 matrix with complex entries. Suppose that A is row-reduced and also that a + b + c + d = 0. Prove that there are exactly three such matrices.



So I figured out that there are seven possible 2x2 matrices that are row-reduced:
[1 0; 0 1], [0 1; 1 0], [0 0; 1 0], [0 0; 0 1], [1 0; 0 0], [0 1; 0 0], [0 0; 0 0]...
and the only one that satisfies the restriction is the last one. What am I missing? Thanks.


This is problem 6 of section 1.3 of Hoffman and Kunze - Linear Algebra.
 

Answers and Replies

  • #2
micromass
Hi rockerman! :smile:

How did you define a row-reduced matrix?

I ask this because I highly doubt that

1) [1 0; 0 1], [0 1; 1 0], [0 0; 1 0], [0 0; 0 1], [1 0; 0 0], [0 1; 0 0], [0 0; 0 0] are the only row-reduced matrices. In fact, I think there are infinitely many of them.

2) I don't think [0 1; 1 0], [0 0; 1 0], [0 0; 0 1] are even row-reduced.

But all of this depends on the definition you're using. So, what is it?
 
  • #3
rockerman
So...

The definition of row-reduced provided by the book is as follows:

An mxn matrix R is called row-reduced if:

a) the first non-zero entry in each non-zero row of R is equal to 1;
b) each column of R which contains the leading non-zero entry of some row has all its other entries 0.

This way, I think there exist only seven row-reduced 2x2 matrices... but I'm not seeing how to prove that there are exactly three such matrices.
 
  • #4
micromass
What about matrices like

[tex]\left(\begin{array}{cc} 1 & a\\ 0 & 0 \end{array}\right)[/tex]

Isn't that row-reduced too?
 
  • #5
rockerman
Oh, how did I miss that...


So there are many row-reduced matrices like that... but how will I find the three matrices that satisfy the condition?
 
  • #6
micromass
Choose a special a...
 
  • #7
rockerman
The condition forces a = -1: for the matrix [1 a; 0 0] we need 1 + a + 0 + 0 = 0, and likewise for [0 0; 1 a]. So the three possible matrices are:

[tex]\left(\begin{array}{cc} 1 & -1\\ 0 & 0 \end{array}\right)[/tex], [tex]\left(\begin{array}{cc} 0 & 0\\ 1 & -1 \end{array}\right)[/tex] and [tex]\left(\begin{array}{cc} 0 & 0\\ 0 & 0 \end{array}\right)[/tex]
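Just to double-check the count against the full list of row-reduced shapes (my own tally, so treat it as a sketch): besides the zero matrix and the forms [1 a; 0 0] and [0 0; 1 a], the only other row-reduced 2x2 matrices are [0 1; 0 0], [0 0; 0 1], [1 0; 0 1] and [0 1; 1 0], whose entry sums are

[tex]1, \qquad 1, \qquad 2, \qquad 2,[/tex]

so none of them can satisfy a + b + c + d = 0, while for the two one-row forms the sum 1 + a vanishes only at a = -1. That leaves exactly the three matrices above.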

is that correct?
 
  • #8
micromass
Seems correct!
 
  • #9
rockerman
I was limiting my thinking to the most obvious cases of such matrices. Many thanks for the help! I have one more doubt about my current exercise set. Can I post it in this topic?

I'd better read the forum rules first.
 
  • #10
micromass
I don't think the rules say anything about such things. So you can post it here if you want :smile:
 
  • #11
rockerman
Oh, thanks...
The problem:
Consider the system of equations AX = 0 where
[tex]A = \left(\begin{array}{cc} a & b\\ c & d \end{array}\right)[/tex]
is a 2x2 matrix over the field F. Prove the following:

(a) If every entry of A is 0, then every pair (x1, x2) is a solution of AX = 0;
(b) if ad - bc != 0, the system AX = 0 has only the trivial solution x1 = x2 = 0.

Attempt at a solution:
(a) is pretty straightforward. I just write out the system {0*x1 + 0*x2 = 0; 0*x1 + 0*x2 = 0} and, by the field properties, conclude that any pair (x1, x2) satisfies it.

(b) left me in doubt. So far in the book the word determinant isn't even mentioned, so I can't use that as an argument. I claim the following:

If ad = bc then I can write a + b = k*(d + c) for some scalar k. In that case the process of producing a row-reduced diagonal matrix fails, resulting in other non-zero solutions. But the problem is exactly how to prove the first statement (that ad = bc implies a + b = k*(d + c)). How can I state this in a mathematical manner?
 
  • #12
rockerman
Analysing the first question again, I found an inconsistency. The matrix could also be, say:

[tex]\left(\begin{array}{cc} 1 & b\\ 0 & 0 \end{array}\right)[/tex]

Is that a problem? Because from the point of view of matrices, these two are different.
 
  • #13
micromass
Well, for (b), can't you just solve the system of equations by row reducing things? Just try to reduce the matrix and you'll see where ad-bc comes in!
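For instance (just a sketch, assuming a ≠ 0; the case a = 0 is handled the same way using b and c): replacing the second row by aR2 - cR1 turns the system into

[tex]\left(\begin{array}{cc} a & b\\ 0 & ad-bc \end{array}\right)\left(\begin{array}{c} x_1\\ x_2 \end{array}\right) = \left(\begin{array}{c} 0\\ 0 \end{array}\right)[/tex]

If ad - bc ≠ 0, the second equation gives x2 = 0, and then the first equation gives a*x1 = 0, so x1 = 0 as well.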
 
  • #14
Hurkyl
An mxn matrix R is called row-reduced if:

a) the first non-zero entry in each non-zero row of R is equal to 1;
b) each column of R which contains the leading non zero entry of some row has all its other entries 0.
Oh weird. I would have expected "row-reduced" to include that the ones are moved to the top and sorted; e.g.
[tex]\left( \begin{matrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 0 \end{matrix} \right)[/tex]
is row reduced, but neither of the following are:
[tex]\left( \begin{matrix} 0 & 0 & 0 \\ 1 & 0 & 0 \\ 0 & 1 & 0 \end{matrix} \right)[/tex]
[tex]\left( \begin{matrix} 0 & 1 & 0 \\ 1 & 0 & 0 \\ 0 & 0 & 0 \end{matrix} \right)[/tex]
 
  • #15
rockerman
As micromass said, it has to do with the definition you're using... in this case the book (Hoffman - Linear Algebra) allows the rows to appear in any order.
 
  • #16
rockerman
Great tip, micromass! Thanks...

I don't know, but maybe I just have to look at the problem in the simplest manner possible...

The first thing I did before was
[tex]A \to \left( \begin{matrix} 1 & b/a \\ c & d \end{matrix} \right)[/tex]
(dividing the first row by a), but after doing that I was stuck and couldn't go any further.
 
