Matrix representation of a linear mapping

Homework Help Overview

The discussion revolves around the matrix representation of a linear mapping, specifically focusing on the transformation between different bases in vector spaces. Participants are exploring the implications of using the identity transformation matrix for changing bases and the properties of invertible matrices in this context.

Discussion Character

  • Conceptual clarification, Mathematical reasoning, Assumption checking

Approaches and Questions Raised

  • Participants discuss the relationship between linear mappings and their matrix representations, questioning how to utilize the invertibility of the transformation matrix. There are suggestions to explore inverse formulas related to change of basis. Some participants express uncertainty about the focus of the discussion, considering whether to concentrate on specific mappings or the broader vector space itself.

Discussion Status

The conversation is ongoing, with participants sharing thoughts and uncertainties about the proof of the statements regarding linear mappings and their matrix representations. There is an acknowledgment of the need for technical work to establish the relationships discussed, and some guidance has been offered regarding the exploration of basis vectors and their transformations.

Contextual Notes

Participants are navigating the complexities of linear transformations and their representations, with some noting the distinction between specific mappings and the vector spaces involved. There is a recognition of the need for clarity in the definitions and assumptions being used in the discussion.

JD_PM
Homework Statement
Is the following statement true or false? If it is the former, prove it. If it is the latter, give a counterexample.

Let ##n \in \mathbb{N}## and ##L:\Re^{n} \rightarrow \Re^{n}## be an injective linear mapping. Let ##A \in \Re^{n \times n}## be an invertible matrix. Then there is a basis ##\alpha## of ##\Re^{n}## and a basis ##\beta## of ##\Re^{n}## such that ##A = L_{\alpha}^{\beta}##.
Relevant Equations
Please see the diagram below.
I know that to go from a vector with coordinates relative to a basis ##\alpha## to a vector with coordinates relative to a basis ##\beta## we can use the matrix representation of the identity transformation: ##\Big( Id \Big)_{\alpha}^{\beta}##.

This can be represented by a diagram:

[Diagram: the identity transformation ##Id## mapping the ##\alpha##-coordinates of a vector to its ##\beta##-coordinates]


Thus note that the linear mapping we are interested in is ##A:X \rightarrow X'##, where:

$$A = \Big( Id \Big)_{\alpha}^{\beta}$$

I think that the statement is true but I think I should use the fact that ##A## is invertible somehow on the above equation in order to prove it. But how?
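The change-of-coordinates fact above can be checked numerically. In the sketch below the two bases are made-up examples, and bases are stored as matrix columns; with that convention, ##\Big( Id \Big)_{\alpha}^{\beta}## works out to ##B^{-1}A## where ##A## and ##B## hold the ##\alpha## and ##\beta## basis vectors as columns:

```python
import numpy as np

# Two (hypothetical) bases of R^2, stored as columns.
alpha = np.array([[1.0, 1.0],
                  [0.0, 1.0]])   # basis alpha: a1 = (1,0), a2 = (1,1)
beta  = np.array([[2.0, 0.0],
                  [0.0, 1.0]])   # basis beta:  b1 = (2,0), b2 = (0,1)

# Matrix of the identity map from alpha-coordinates to beta-coordinates:
# its columns are the alpha basis vectors written in beta-coordinates,
# i.e. (Id)_alpha^beta = beta^{-1} @ alpha.
id_alpha_beta = np.linalg.solve(beta, alpha)

# Check: a coordinate vector x_alpha and its image id_alpha_beta @ x_alpha
# describe the same underlying vector.
x_alpha = np.array([3.0, -1.0])
v = alpha @ x_alpha                 # the vector itself, in standard coordinates
x_beta = id_alpha_beta @ x_alpha
assert np.allclose(beta @ x_beta, v)
```

Using `np.linalg.solve(beta, alpha)` instead of explicitly inverting `beta` is the standard numerically safer way to compute ##B^{-1}A##.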
 
I think it is true (not entirely sure), but I think it will take some technical work to prove it. Try playing around with the formulas for change of basis and try to come up with "inverse" formulas for them.
 
JD_PM said:
Thus note that the linear mapping we are interested in is ##A:X \rightarrow X'##, where:

$$A = \Big( Id \Big)_{\alpha}^{\beta}$$

I think that the statement is true but I think I should use the fact that ##A## is invertible somehow on the above equation in order to prove it. But how?
Interested in for what? How does ##L## fit in here?
 
Math_QED said:
I think it is true (not entirely sure), but I think it will take some technical work to prove it. Try playing around with the formulas for change of basis and try to come up with "inverse" formulas for them.

Alright thanks, I'll think about it and post what I get.
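One such "inverse" formula is that the change-of-basis matrices in the two directions are matrix inverses of each other: going ##\alpha \rightarrow \beta \rightarrow \alpha## is the identity on coordinates. A minimal numerical sketch (the bases here are made up, stored as columns):

```python
import numpy as np

# Hypothetical bases of R^2, stored as columns.
alpha = np.array([[1.0, 2.0],
                  [0.0, 1.0]])
beta  = np.array([[1.0, 0.0],
                  [1.0, 1.0]])

# Change-of-basis matrices in both directions.
id_alpha_beta = np.linalg.solve(beta, alpha)   # alpha-coords -> beta-coords
id_beta_alpha = np.linalg.solve(alpha, beta)   # beta-coords -> alpha-coords

# "Inverse" formula: composing the two changes of coordinates gives the
# identity, so (Id)_beta^alpha is the matrix inverse of (Id)_alpha^beta.
assert np.allclose(id_beta_alpha @ id_alpha_beta, np.eye(2))
```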
 
vela said:
Interested in for what?

I thought ##A:X \rightarrow X'## was the linear mapping we were interested in because it is the one that involves elements of ##\Re^{n}##.

But recently I've been thinking that I shouldn't focus on the elements but on the vector space ##\Re^{n}## itself. In other words: I think I should focus on ##L:\Re^{n} \rightarrow \Re^{n}## instead of ##A:X \rightarrow X'##.
 
Given a linear function, F(X), from R^n -> R^m, and bases for both R^n and R^m, there is a unique matrix with m rows and n columns that represents that function. To find the entries of the matrix, apply F to each basis vector of R^n in turn, writing the result as a linear combination of the basis vectors of R^m. The coefficients of each such linear combination give one column of the matrix.
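This column-by-column construction can be carried out numerically. In the sketch below, the map F and both bases are made-up examples (bases stored as columns); writing F of a basis vector in codomain coordinates amounts to solving a linear system:

```python
import numpy as np

# A sample linear map F: R^2 -> R^2 (hypothetical), F(x, y) = (x + y, 2y).
def F(v):
    x, y = v
    return np.array([x + y, 2.0 * y])

# Hypothetical bases for the domain and codomain, stored as columns.
domain_basis   = np.array([[1.0, 0.0],
                           [1.0, 1.0]])
codomain_basis = np.array([[2.0, 0.0],
                           [0.0, 1.0]])

# Build the matrix column by column: apply F to each domain basis vector
# and express the image in codomain-basis coordinates.
cols = [np.linalg.solve(codomain_basis, F(domain_basis[:, j]))
        for j in range(domain_basis.shape[1])]
M = np.column_stack(cols)

# Sanity check: for any vector v, F(v) written in codomain coordinates
# equals M applied to v written in domain coordinates.
v = np.array([3.0, -2.0])
v_dom = np.linalg.solve(domain_basis, v)
assert np.allclose(codomain_basis @ (M @ v_dom), F(v))
```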
 
HallsofIvy said:
Given a linear function, F(X), from R^n -> R^m, and bases for both R^n and R^m, there is a unique matrix with m rows and n columns that represents that function. To find the entries of the matrix, apply F to each basis vector of R^n in turn, writing the result as a linear combination of the basis vectors of R^m. The coefficients of each such linear combination give one column of the matrix.

The question does not ask this, but rather the converse. Can you find bases such that a linear transformation has a given matrix?
 
