Linear Combination Mapping: Is the Invertible Matrix Theorem True or False?

In summary, the statement "if the linear transformation x -> Ax maps Rn into Rn, then the row reduced echelon form of A is I" is false: it holds only if the transformation maps Rn onto Rn, not merely into Rn. "Onto" means the transformation reaches every vector in Rn, while "into" allows the image to miss part of Rn. The projection matrix given in the second post of the thread, for example, does not row reduce to the identity matrix. The thread also discusses why two statements in the Invertible Matrix Theorem are equivalent: for a square matrix, Ax = b having at least one solution for every b in Rn is equivalent to the transformation x -> Ax being one-to-one.
  • #1
henry3369
194
0
True or False:
If the linear transformation x -> Ax maps Rn into Rn, then the row reduced echelon form of A is I.

I don't understand why this is False. My book says it is false because it is only true if it maps Rn ONTO Rn instead of Rn INTO Rn. What difference does the word into make?
 
  • #2
henry3369 said:
True or False:
If the linear transformation x -> Ax maps Rn into Rn, then the row reduced echelon form of A is I.

I don't understand why this is False. My book says it is false because it is only true if it maps Rn ONTO Rn instead of Rn INTO Rn. What difference does the word into make?
Consider the transformation whose matrix looks like this:
$$A =
\begin{bmatrix}1 & 0 & 0 & \dots & 0 \\
0 & 0 & 0 & \dots & 0 \\
\vdots \\
0 & 0 & 0 & \dots & 0
\end{bmatrix}$$
This transformation maps a vector x to its projection on the ##x_1## axis, a map from Rn into Rn (but not onto Rn). Does it seem likely to you that this matrix will row reduce to the identity matrix?
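As a concrete check, here is a minimal sketch of that row reduction using sympy, on a hypothetical 3×3 instance of the matrix above (the size is an assumption; the pattern is the same for any n):

```python
import sympy as sp

# 3x3 instance of the projection-onto-the-x1-axis matrix from the post above
A = sp.Matrix([[1, 0, 0],
               [0, 0, 0],
               [0, 0, 0]])

# rref() returns the row reduced echelon form and the tuple of pivot columns
rref, pivots = A.rref()

print(rref)    # A is already in RREF, and it is not the identity
print(pivots)  # only one pivot column: (0,)
```

With a single pivot, the RREF can never be I, which is exactly why mapping Rn *into* Rn is not enough.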
 
  • #3
Mark44 said:
Consider the transformation whose matrix looks like this:
$$A =
\begin{bmatrix}1 & 0 & 0 & \dots & 0 \\
0 & 0 & 0 & \dots & 0 \\
\vdots \\
0 & 0 & 0 & \dots & 0
\end{bmatrix}$$
This transformation maps a vector x to its projection on the ##x_1## axis, a map from Rn into Rn (but not onto Rn). Does it seem likely to you that this matrix will row reduce to the identity matrix?
So if it maps onto Rn does that mean that it maps x into every position in Rn?
 
  • #4
Also can you clarify why these two statements are equivalent in the Invertible Matrix Theorem:
1. The equation Ax = b has at least one solution for each b in Rn
2. The linear transformation x -> Ax is one-to-one.

What is throwing me off is the word least in statement 1. Shouldn't it have exactly one solution for every b in order for the transformation to be one-to-one?
 
  • #5
henry3369 said:
Also can you clarify why these two statements are equivalent in the Invertible Matrix Theorem:
1. The equation Ax = b has at least one solution for each b in Rn
2. The linear transformation x -> Ax is one-to-one.

What is throwing me off is the word least in statement 1. Shouldn't it have exactly one solution for every b in order for the transformation to be one-to-one?
Yes, I'm bothered by it as well. It's technically correct, but misleading, as it seems to imply that for some b there might be two input x values. That can't happen if the transformation is one-to-one, though.

To me, it's sort of like saying 3 + 4 is at least 7.
 
  • #6
Yes, this is rank-nullity, together with the fact that a linear map is injective iff it has a trivial kernel.
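A small sketch of that argument in sympy, on two hypothetical 2×2 matrices (one invertible, one singular): rank-nullity forces "trivial kernel" and "full rank" to coincide for square matrices, which is why one-to-one and onto are equivalent here.

```python
import sympy as sp

A = sp.Matrix([[1, 2], [3, 4]])   # invertible: rank 2, nullity 0
B = sp.Matrix([[1, 2], [2, 4]])   # singular:   rank 1, nullity 1

for M in (A, B):
    n = M.cols
    rank = M.rank()
    nullity = len(M.nullspace())   # dimension of the kernel
    assert rank + nullity == n     # rank-nullity theorem
    injective = (nullity == 0)     # one-to-one iff trivial kernel
    onto = (rank == n)             # onto iff full rank
    assert injective == onto       # equivalent for square matrices
```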
 

Related to Linear Combination Mapping: Is the Invertible Matrix Theorem True or False?

What is the Invertible Matrix Theorem?

The Invertible Matrix Theorem is a theorem in linear algebra that collects a long list of conditions on a square matrix, all of which are equivalent to the matrix being invertible; one of these conditions is that its determinant is non-zero.

Why is the Invertible Matrix Theorem important?

The Invertible Matrix Theorem is important because it provides a necessary and sufficient condition for a matrix to have an inverse, which is crucial in many applications of linear algebra, such as solving systems of equations and calculating eigenvalues.
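For example, a minimal sketch with sympy (the numbers are hypothetical): once the determinant is non-zero, the theorem guarantees Ax = b has a unique solution.

```python
import sympy as sp

A = sp.Matrix([[2, 1], [1, 3]])
b = sp.Matrix([5, 10])

assert A.det() != 0   # non-zero determinant, so A is invertible

x = A.solve(b)        # the unique solution A**-1 * b
assert A * x == b     # and it does satisfy the system
```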

How is the Invertible Matrix Theorem used in real life?

The Invertible Matrix Theorem is used in various fields such as engineering, physics, and economics to solve problems involving matrices, such as finding solutions to linear systems of equations and determining the stability of systems.

What are the implications of the Invertible Matrix Theorem?

The Invertible Matrix Theorem has several important implications, including the fact that a matrix A is invertible if and only if the equation Ax = b has a unique solution for every b, and that the columns of an invertible matrix are linearly independent.

Are there any limitations to the Invertible Matrix Theorem?

Yes, the Invertible Matrix Theorem only applies to square matrices, meaning matrices with an equal number of rows and columns. It does, however, apply to matrices with complex entries; in that case the determinant is a complex number, and the condition for invertibility is still that it is non-zero.
