Trouble understanding linear transformations in this context

Eclair_de_XII

Homework Statement


"Show that every subspace of ##ℝ^n## is the set of solutions to a homogeneous system of linear equations. (Hint: If a subspace ##W## consists of only the zero vector or is all of ##ℝ^n##, ##W## is the set of solutions to ##IX=0## or ##0_vX=0##, respectively.

Assume ##W## is not one of these two subspaces. Let ##β=\{v_1,...,v_k\}## for ##W##. Extend this basis to ##β`=β∪\{v_{k+1},...,v_n\}## to span all of ##ℝ^n##. Let ##T: ℝ^n → ℝ^n## be the linear transformation so that ##T(v_1)=T(v_2)=...=T(v_k)=0## and ##T(v_j)=I(v_j)## for ##k+1≤j≤n##, where ##I(v)## is the identity function. Now use ##T## to obtain a matrix A so that ##W## is the set of solutions to the homogeneous system ##AX=0_v##.)"

Homework Equations


##T_\alpha=PT_βP^{-1}##

where ##P## is the transition matrix from basis ##β## to basis ##α##.
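If I read the hint correctly, the matrix of its ##T## in the basis ##\beta'## would be ##T_{\beta'}=\begin{pmatrix} 0_{k\times k} & 0 \\ 0 & I_{n-k} \end{pmatrix}##, so with ##P=\left[v_1|\cdots|v_n\right]## the matrix in the standard basis should be ##A=P\,T_{\beta'}\,P^{-1}## (not sure if that is the intended ##A##).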

The Attempt at a Solution


##T(c_1v_1+...+c_kv_k)=T(c_1v_1)+...+T(c_kv_k)=c_1T(v_1)+...+c_kT(v_k)=0_v##
##T(c_{k+1}v_{k+1}+...+c_nv_n)=c_{k+1}T(v_{k+1})+...+c_nT(v_n)=c_{k+1}v_{k+1}+...+c_nv_n=0_v##

I honestly don't know what I'm doing here, and if anyone could give feedback on what I'm doing wrong, or what I should actually be doing instead, that would be much appreciated. Do I have to left-multiply a column vector whose entries are the constants ##c_i## by the ##n×n## transformation matrix, and show that the product is the zero vector, i.e. that the vector solves the homogeneous system? ##\left[T(v)\right]\left[c_i\right]=\left[0_v\right]##. Something like that, maybe? Sorry, and thanks.
 
Consider the diagonal matrix ##\Lambda = \mathrm{diag}(\lambda_1, \dots, \lambda_n)## and the matrix ##P = \begin{pmatrix} v_1 & v_2 & \dots & v_n\end{pmatrix}##, where ##\{v_i : i = 1, 2, \dots, n\}## is a basis.

What is ##P\Lambda P^{-1} v_i##?
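(One step that may help, writing ##e_i## for the ##i##-th standard basis vector: by construction ##Pe_i = v_i##, so ##P^{-1}v_i = e_i## and ##\Lambda e_i = \lambda_i e_i##.)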
 
I would suggest first that you write the matrix form of the equation in the first constructed basis, where you have a basis of the subspace ##W## extended to the rest of the space. Try this with say 3 dimensions and W the x-y plane. Work out the details of the concrete example and then see if you can understand the generalization.

Once you have the system of equations in the original basis, you transform to the arbitrary basis with the ##T## matrix.
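For instance, with ##W## the x-y plane and the standard basis, the system you should end up with is just ##z=0##, i.e. ##AX=0## with ##A=\begin{pmatrix} 0 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 1 \end{pmatrix}## (or simply the single row ##\begin{pmatrix} 0 & 0 & 1 \end{pmatrix}##).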
 
pasmith said:
What is ##P\Lambda P^{-1} v_i##?

It looks like a solution to ##\Lambda v_i## using a different basis, I think.

jambaugh said:
Try this with say 3 dimensions and W the x-y plane.

Okay, so...

Let ##W⊆ℝ^3## be spanned by ##\beta=\{b_1,b_2\}##. Then let ##\beta'=\beta∪\{b_3\}##.

Let ##P=\left[b_1|b_2|b_3\right] = \begin{pmatrix}
a & d & x \\
b & e & y \\
c & f & z \end{pmatrix}##.

Then I apply the transformation matrix,

##T=\begin{pmatrix}
0 & 0 & 0 \\
0 & 0 & 0 \\
0 & 0 & 1 \\
\end{pmatrix}##

So... ##PT=\begin{pmatrix}
0 & 0 & x \\
0 & 0 & y \\
0 & 0 & z \\
\end{pmatrix}##
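
As a sanity check, here is a quick numerical sketch (numpy, with example vectors I made up) of what I think the construction should give if I use the relevant equation ##A=PT_{\beta'}P^{-1}## rather than just ##PT##:

```python
import numpy as np

# A basis of W (two made-up vectors spanning a plane in R^3).
b1 = np.array([1.0, 2.0, 0.0])
b2 = np.array([0.0, 1.0, 1.0])
# One more independent vector to extend beta to a basis of R^3.
b3 = np.array([1.0, 0.0, 0.0])

# P has the basis vectors of beta' as its columns.
P = np.column_stack([b1, b2, b3])

# Matrix of T in the basis beta': send W to zero, keep the rest.
T_beta = np.diag([0.0, 0.0, 1.0])

# Matrix of T in the standard basis: A = P T_{beta'} P^{-1}.
A = P @ T_beta @ np.linalg.inv(P)

print(A @ b1)  # ~ [0, 0, 0]: b1 solves AX = 0
print(A @ b2)  # ~ [0, 0, 0]: b2 solves AX = 0
print(A @ b3)  # nonzero: b3 is not a solution
```

If that is right, then the solution set of ##AX=0## is exactly the span of ##b_1## and ##b_2##, which is ##W##.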
 