Is the linear mapping T(x) one to one?

Summary:
The discussion revolves around determining if the linear transformation T, defined by a specific matrix A, is one-to-one. Initial analysis using row reduction suggests that the columns of A are linearly independent, leading to the conclusion that T is one-to-one. However, further examination reveals that one of the vectors is a linear combination of others, indicating linear dependence and thus questioning the one-to-one nature of T. Participants clarify the definitions of linear independence and dependence, emphasizing that the focus should be on the row or column vectors of the matrix rather than the matrix itself. Ultimately, the consensus is that T is not one-to-one due to the linear dependence among the vectors.
rmiller70015
Homework Statement


This is for a linear algebra class, but it's taught by mathematicians, for mathematicians (not physicists or engineers), so we write pseudo-proofs to explain things.

In Exercises 37–40, let T be the linear transformation whose standard matrix is given. In Exercises 37 and 38, decide if T is a one-to-one mapping.

38. ##A = \begin{bmatrix}
7&5&4&-9\\
10&6&16&-4\\
12&8&12&7\\
-8&-6&-2&5\\
\end{bmatrix}##

Homework Equations


Theorem 9: (attached image; as cited in the attempt below, a set of vectors that contains the zero vector is linearly dependent)

Theorem 12: (attached image; as cited in the attempt below, ##T## is one-to-one if the columns of its standard matrix are linearly independent)


The Attempt at a Solution


I used Mathematica to row reduce the matrix, and the reduced row echelon form is:
##\begin{bmatrix} 1&0&7&0\\
0&1&-9&0\\
0&0&0&1\\
0&0&0&0\\
\end{bmatrix}##
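If you want to double-check the row reduction without Mathematica, here is a minimal sympy sketch (assuming Python with sympy installed, which is not part of the original post) that reproduces the same RREF and also reports the rank and a null-space vector:

```python
# Minimal check of the row reduction with sympy; the entries are copied
# from the matrix A given in the problem statement.
from sympy import Matrix

A = Matrix([
    [ 7,  5,  4, -9],
    [10,  6, 16, -4],
    [12,  8, 12,  7],
    [-8, -6, -2,  5],
])

rref, pivot_cols = A.rref()   # reduced row echelon form and pivot column indices
print(rref)                   # matches the RREF quoted above
print(A.rank())               # 3, i.e. fewer pivots than columns
print(A.nullspace())          # a nonzero vector x with A*x = 0
```

The rank coming out as 3 (one pivot short of the number of columns) is the quickest signal from this check.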
Let ##T: \mathbb{R}^4 \rightarrow \mathbb{R}^4## be defined by ##T(\bar{x}) = A\bar{x}##, where ##A=[\bar{a_1}\ \bar{a_2}\ \bar{a_3}\ \bar{a_4}]##.

By Theorem 12, ##T(\bar{x})## is one to one if the columns of ##A## are linearly independent.
Let there be an indexed set of vectors ##S_A = \{\bar{a_1}, \bar{a_2}, \bar{a_3}, \bar{a_4}\}##, then by theorem 9 ##S_A## is linearly dependent if ##\bar{0} \in S_A##, but ##\bar{0} \notin S_A##, so ##S_A## is linearly independent.
Therefore, ##T(x)## is one to one.

TL;DR My answer is that the mapping is one to one, but I am not sure if that is true or not.
 

I personally don't believe that the mapping (judging by the RREF matrix) is one-to-one.

Why don't you try testing Theorem 12 (b)? Can you tell me what the definition of linear independence is?
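For reference, the usual textbook definition: vectors ##\bar{v_1},\ldots,\bar{v_p}## are linearly independent if the only scalars satisfying

$$c_1\bar{v_1}+c_2\bar{v_2}+\cdots+c_p\bar{v_p}=\vec{0}$$

are ##c_1=c_2=\cdots=c_p=0##; if some nontrivial choice of the ##c_i## works, the set is linearly dependent.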
 
Eclair_de_XII said:
I personally don't believe that the mapping (judging by the RREF matrix) is one-to-one.

Why don't you try testing Theorem 12 (b)? Can you tell me what the definition of linear independence is?

Yeah, I see now that ##\bar{a_3}## is a linear combination of ##\bar{a_1}## and ##\bar{a_2}##, so ##S_A## is linearly dependent (there is a nontrivial linear combination of its vectors equal to ##\bar{0}##), even though ##\bar{0} \notin S_A##.
As a follow-up, I like to think in terms of row vectors. Would it be acceptable for me to take the transpose of the matrix and row reduce it to show that a zero row turns up?
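For what it's worth, the row-vector picture detects the same thing, since for this matrix

$$\operatorname{rank}(A)=\operatorname{rank}(A^T)=3<4,$$

so row reducing ##A^T## would likewise produce a zero row, reflecting the same dependence among the columns of ##A##.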
 
rmiller70015 said:
By Theorem 12, ##T(\bar{x})## is one to one if the columns of ##A## are linearly independent.
Let there be an indexed set of vectors ##S_A = \{\bar{a_1}, \bar{a_2}, \bar{a_3}, \bar{a_4}\}##, then by theorem 9 ##S_A## is linearly dependent if ##\bar{0} \in S_A##, but ##\bar{0} \notin S_A##, so ##S_A## is linearly independent.
Therefore, ##T(x)## is one to one.

TL;DR My answer is that the mapping is one to one, but I am not sure if that is true or not.

Your Theorem 9 does not contain "and only if".
A matrix is linearly dependent if any linear combination of a proper subset of the rows or the columns is the zero vector.
 

Theorem 9 is rather strange. Of course it is trivially true, since the zero vector itself is linearly dependent: from ##\lambda \cdot \vec{0} =\vec{0}## one cannot conclude ##\lambda = 0##. And if any vectors are added to a set that already contains ##\vec{0}##, then it remains linearly dependent. This doesn't say much about the concept of linear independence.
willem2 said:
A matrix is linearly dependent if any linear combination of a proper subset of the rows or the columns is the zero vector.
This is a very strange and actually wrong wording. A single matrix is always linearly independent as long as it isn't the zero matrix. This is also trivially true, because a single vector is always linearly independent as long as it isn't the zero vector. To say this about a matrix, the matrix must be considered as a vector, because the concept of linear (in)dependence applies only to vectors. However, this is completely irrelevant here, as matrices are considered to represent a linear function and not a vector in their own right.

So what we are actually talking about are the row or column vectors of the matrix, not the matrix itself.
The rest of the sentence, "any linear combination of a proper subset of the rows or the columns is the zero vector", is simply nonsense.

Correct is: any linear combination of rows or columns ##LC=\lambda_1\cdot v_1+\ldots +\lambda_k \cdot v_k## (forget the subsets) which results in the zero vector (##LC=\vec{0}##) must necessarily be the trivial linear combination (##\lambda_1 = \ldots = \lambda_k =0##). One can always write the zero vector in this way; the point is that, in the case of linear independence, this has to be the only way to do so.
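Applied to the matrix from the original post, the RREF found above gives exactly such a nontrivial linear combination of the columns:

$$7\bar{a_1} - 9\bar{a_2} - \bar{a_3} + 0\cdot\bar{a_4} = \vec{0},$$

so the columns are linearly dependent, ##A\bar{x}=\vec{0}## has the nonzero solution ##\bar{x}=(7,-9,-1,0)^T##, and the mapping ##T## is not one-to-one.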
 
