Notation used in matrix representation of linear transformation

In summary: The thread discusses the notation ##[T]_{B,B'}##, which denotes the matrix of a linear transformation with respect to two different bases. Which basis is written first is a matter of convention, so it only matters that the notation is used consistently. The matrix is found by expressing the images of one basis's vectors in terms of the other basis and arranging the coefficients as columns.
  • #1
Seydlitz
Hello guys,

Let ##T: \mathbb{R^2} \to \mathbb{R^2}##. Suppose I have the standard basis ##B = \{u_1, u_2\}## and another basis ##B^{\prime} = \{v_1, v_2\}##. The linear transformation is described, say, as ##T(v_1) = v_1 + v_2, T(v_2) = v_1##.

If I want to write the matrix representing ##T## with respect to the basis ##B^{\prime}##, I just find ##[T]_{B'}##. I can also find ##[T]_{B}## rather straightforwardly using a similarity transformation, provided I know the transition matrix between the two bases.

But then I encountered the notation ##[T]_{B,B'}##, and I don't know exactly what it represents. Do you know what this notation means? What other matrix should I provide in this case? Normally I use a comma subscript to denote a transition matrix between bases, but never for the matrix of a linear transformation.

Thank You
 
  • #2
It means that you take the basis ##B^\prime## on the domain and ##B## on the codomain (both are ##\mathbb{R}^2##). Or the other way around, depending on who is using the notation.

The idea is to see what happens to the ##v_i##: look at ##T(v_1)## and ##T(v_2)## and express them in the ##\{u_1,u_2\}## basis. So you write ##T(v_1) = \alpha u_1 + \beta u_2## and ##T(v_2) = \gamma u_1 + \delta u_2##. Then the matrix you seek is

[tex]\left(\begin{array}{cc}
\alpha & \gamma\\
\beta & \delta
\end{array}\right)[/tex]
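
The recipe above can be sketched numerically. The thread leaves ##v_1, v_2## abstract, so the concrete vectors below are a hypothetical choice purely for illustration; the construction itself (columns of ##[T]_{B,B'}## are the ##B##-coordinates of ##T(v_j)##) is exactly what is described above.

```python
import numpy as np

# Hypothetical concrete choice of the basis B' (the thread leaves v1, v2 abstract):
v1 = np.array([1.0, 1.0])
v2 = np.array([0.0, 1.0])

# T as defined in the thread: T(v1) = v1 + v2, T(v2) = v1
Tv1 = v1 + v2
Tv2 = v1

# Columns of [T]_{B,B'} are the coordinates of T(v_j) in the standard basis B.
# Since B is the standard basis, those coordinates are just the vectors themselves.
M = np.column_stack([Tv1, Tv2])
print(M)  # [[1. 1.]
          #  [2. 1.]]
```

As a sanity check, if ##x## has ##B'##-coordinates ##c##, then ##Mc## gives the standard coordinates of ##T(x)##: for ##x = v_1## (so ##c = (1,0)##), ##Mc = (1, 2) = v_1 + v_2##, as required.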
 
  • #3
micromass said:
It means that you take the basis ##B^\prime## on the domain and ##B## on the codomain (both are ##\mathbb{R}^2##). Or the other way around, depending on who is using the notation.

The idea is to see what happens to the ##v_i##: look at ##T(v_1)## and ##T(v_2)## and express them in the ##\{u_1,u_2\}## basis. So you write ##T(v_1) = \alpha u_1 + \beta u_2## and ##T(v_2) = \gamma u_1 + \delta u_2##. Then the matrix you seek is

[tex]\left(\begin{array}{cc}
\alpha & \gamma\\
\beta & \delta
\end{array}\right)[/tex]

Thanks micromass for the help. It makes sense. I managed to get that matrix by post-multiplying ##[T]_B## by the transition matrix ##P_{B' \to B}##. I was just really confused because one of the texts I'm reading apparently got the matrix wrong. (Not to mention that it uses the comma and arrow notations interchangeably.)
 
  • #4
Seydlitz said:
Thanks micromass for the help. It makes sense. I managed to get that matrix by post-multiplying ##[T]_B## by the transition matrix ##P_{B' \to B}##.

That works too.

I was just really confused because one of the texts I'm reading apparently got the matrix wrong. (Not to mention that it uses the comma and arrow notations interchangeably.)

What did the text say?
 
  • #5
micromass said:
That works too.

What did the text say?

It's an example problem. The desired matrix there agrees with my own work, except that it is somehow transposed.
 

What is a matrix representation of a linear transformation?

A matrix representation of a linear transformation encodes the transformation's action on a chosen basis as a matrix. Once the matrix is in hand, applying the transformation to any vector reduces to matrix-vector multiplication, which makes calculations and visualization much easier.

What is the purpose of using notation in a matrix representation of a linear transformation?

The purpose of using notation in a matrix representation of a linear transformation is to simplify and organize the representation of the transformation. By using symbols and standard notation, it becomes easier to manipulate and analyze the linear transformation and its properties.

What are the common notations used in a matrix representation of a linear transformation?

The common notations used in a matrix representation of a linear transformation include matrices, vectors, subscripts, coefficients, and operations such as addition and multiplication. These notations are used to represent the inputs, outputs, and operations involved in the linear transformation.

How does the matrix representation of a linear transformation relate to the geometric representation?

The matrix representation of a linear transformation is closely related to its geometric representation. The columns of the transformation matrix correspond to the images of the standard basis vectors, which can be visualized as arrows in the input space. The transformation of any vector can then be obtained by multiplying the matrix by the vector, resulting in a new vector that represents the transformation in the output space.

How do you determine the matrix representation of a linear transformation?

To determine the matrix representation of a linear transformation, you first need to identify the standard basis vectors in the input space and their corresponding images in the output space. Then, you can construct a matrix by arranging the images of the standard basis vectors as columns. This matrix will represent the linear transformation in the given basis.
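
The procedure in the last answer can be sketched in a few lines. The rotation used below is an illustrative choice of transformation, not anything from the thread; the point is that stacking the images of the standard basis vectors as columns recovers a matrix that reproduces ##T##.

```python
import numpy as np

# Illustrative sample transformation: rotation by 90 degrees.
def T(x):
    return np.array([-x[1], x[0]])

e1, e2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])

# Arrange the images of the standard basis vectors as columns:
A = np.column_stack([T(e1), T(e2)])
print(A)  # [[ 0. -1.]
          #  [ 1.  0.]]

# A @ x now reproduces T(x) for any vector x:
x = np.array([2.0, 3.0])
print(A @ x, T(x))  # [-3.  2.] [-3.  2.]
```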
