Prove the existence of row-reduced matrices with restrictions

rockerman
Let A = [a b; c d] be a 2x2 matrix with complex entries. Suppose that A is row-reduced and also that a + b + c + d = 0. Prove that there are exactly three such matrices.

So I figured that there are seven possible 2x2 matrices that are row-reduced:
[1 0; 0 1], [0 1; 1 0], [0 0; 1 0], [0 0; 0 1], [1 0; 0 0], [0 1; 0 0], [0 0; 0 0]...
and the only one that satisfies the restriction is the last. What am I missing? Thanks!


This is the problem 6 of the section 1.3 of HOFFMAN and KUNZE - Linear Algebra.
 
Hi rockerman! :smile:

How did you define a row-reduced matrix?

I ask this because I highly doubt that

1) [1 0; 0 1], [0 1; 1 0], [0 0; 1 0], [0 0;0 1], [1 0; 0 0], [0 1; 0 0], [0 0; 0 0] are the only row-reduced matrices. In fact, I think there are infinitely many of them.

2) I don't think [0 1; 1 0], [0 0; 1 0], [0 0; 0 1] are even row-reduced.

But all of this depends on the definition you're using. So, what is it?
 
so..

the row reduced definition provided by the book is given as follows:

An mxn matrix R is called row-reduced if:

a) the first non-zero entry in each non-zero row of R is equal to 1;
b) each column of R which contains the leading non-zero entry of some row has all its other entries 0.

This way, I thought that only seven row-reduced matrices exist in C^(2x2)... but I'm not figuring out how to prove that there are exactly three such matrices.
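Not from the thread, but the definition above can be checked mechanically. A small Python sketch (the function name `is_row_reduced` is my own) shows that a matrix of the form [1 a; 0 0] passes for every value of a, so there are far more than seven row-reduced 2x2 matrices:

```python
def is_row_reduced(m):
    """m is a list of rows. Hoffman & Kunze, sec. 1.3:
    (a) the first non-zero entry of each non-zero row is 1;
    (b) each column containing such a leading 1 is zero elsewhere."""
    lead_cols = []
    for row in m:
        nonzero = [j for j, x in enumerate(row) if x != 0]
        if not nonzero:
            continue            # a zero row imposes no condition
        j = nonzero[0]
        if row[j] != 1:
            return False        # condition (a) violated
        lead_cols.append(j)
    # condition (b): each leading column has exactly one non-zero entry
    return all(sum(1 for row in m if row[j] != 0) == 1 for j in lead_cols)

# [1 a; 0 0] is row-reduced for every a -- infinitely many such matrices:
assert all(is_row_reduced([[1, a], [0, 0]]) for a in (-2, 0, 7))
# but e.g. [1 0; 1 0] is not (two non-zero entries in a leading column):
assert not is_row_reduced([[1, 0], [1, 0]])
```

Note that the book's definition does not require the rows to be sorted, so [0 1; 1 0] also passes this check.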
 
What about matrices like

\left(\begin{array}{cc} 1 & a\\ 0 & 0 \end{array}\right)

Isn't that row-reduced too?
 
oh, how I've missed this...


so there are many row-reduced matrices like that... but how will I find the three matrices that satisfy the condition?
 
Choose a special a...
 
The a that satisfies the condition is a = -1, since the leading entry already contributes 1 to the sum (1 + a + 0 + 0 = 0). So the three possible matrices are:

\left(\begin{array}{cc} 1 & -1\\ 0 & 0 \end{array}\right), \left(\begin{array}{cc} 0 & 0\\ 1 & -1 \end{array}\right) and \left(\begin{array}{cc} 0 & 0\\ 0 & 0 \end{array}\right)

is that correct?
 
Seems correct!
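For the form [1 a; 0 0] the entries sum to 1 + a, so the condition a + b + c + d = 0 forces a = -1 (likewise for [0 0; 1 a]), and the zero matrix works trivially. A quick sanity check of the three candidates in code (not from the thread):

```python
# The three row-reduced 2x2 matrices whose entries sum to zero:
three = [
    [[1, -1], [0, 0]],
    [[0, 0], [1, -1]],
    [[0, 0], [0, 0]],
]
for m in three:
    # a + b + c + d = 0 for each candidate
    assert sum(x for row in m for x in row) == 0
```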
 
I was limiting my thinking to the most obvious cases. Many thanks for the help! By now I have only one more doubt about my current exercise set. Can I post it in this topic?

I'd better read the forum rules first;
 
  • #10
I don't think the rules say anything about such a thing. So you can post it here if you want :smile:
 
  • #11
oh thanx...
The problem
Consider the system of equations AX = 0 where
A = \left(\begin{array}{cc} a & b\\ c & d \end{array}\right)
is a 2x2 matrix over the field F. Prove the following

(a) If every entry of A is 0 then every pair (x1, x2) is a solution of AX = 0;
(b) if ad-bc != 0, the system AX = 0 has only the trivial solution x1 = x2 =0;

Attempt to solution:
(a) is pretty straightforward. I just write out the system {0*x1 + 0*x2 = 0; 0*x1 + 0*x2 = 0}
and by the field axioms conclude that any (x1, x2) satisfies it.

(b) left me in doubt. So far in the book, the word determinant isn't even mentioned, so I can't use it as an argument. I claim the following:

If ad = bc, then I can write a + b = k*(c + d) for some scalar k. This way the process of row-reducing to the identity will fail, resulting in other, non-zero solutions. But the problem is exactly in proving the first statement (that ad = bc implies a + b = k*(c + d)). How can I state this in a mathematical manner?
 
  • #12
Analysing the first question, I found an inconsistency. The matrix could also be, say:

\left(\begin{array}{cc} 1 & b\\ 0 & 0 \end{array}\right)

Is that a problem? Because from the point of view of matrices, these two are different.
 
  • #13
rockerman said:
(b) left me in doubt. So far in the book, the word determinant isn't even mentioned, so I can't use it as an argument. [...]

Well, for (b), can't you just solve the system of equations by row reducing things? Just try to reduce the matrix and you'll see where ad-bc comes in!
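One way to write out that elimination explicitly (a sketch; it avoids division, so it is valid over any field F):

```latex
% Multiply the first equation by d, the second by b, and subtract:
d(ax_1 + bx_2) - b(cx_1 + dx_2) = (ad - bc)\,x_1 = 0 .
% Symmetrically, eliminating x_1 instead:
a(cx_1 + dx_2) - c(ax_1 + bx_2) = (ad - bc)\,x_2 = 0 .
% If ad - bc \neq 0, it has an inverse in F, hence x_1 = x_2 = 0.
```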
 
  • #14
rockerman said:
An mxn matrix R is called row-reduced if:

a) the first non-zero entry in each non-zero row of R is equal to 1;
b) each column of R which contains the leading non zero entry of some row has all its other entries 0.
Oh weird. I would have expected "row-reduced" to include that the ones are moved to the top and sorted; e.g.
\left( \begin{matrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 0 \end{matrix} \right)
is row reduced, but neither of the following are:
\left( \begin{matrix} 0 & 0 & 0 \\ 1 & 0 & 0 \\ 0 & 1 & 0 \end{matrix} \right)
\left( \begin{matrix} 0 & 1 & 0 \\ 1 & 0 & 0 \\ 0 & 0 & 0 \end{matrix} \right)
 
  • #15
Hurkyl said:
Oh weird. I would have expected "row-reduced" to include that the ones are moved to the top and sorted [...]

As micromass said, it has to do with the definition you're using... in this case the book (Hoffman & Kunze, Linear Algebra) allows the rows in any order.
 
  • #16
Great tip, micromass! Thanks...

Maybe I just have to look at the problem in the simplest manner possible...

The first thing I did before was:
(1) -> A = \left( \begin{matrix} 1 & b/a \\ c & d \end{matrix} \right)
but doing so already assumes a ≠ 0, and after that I was stuck and nothing more could be done.
 