iJake said:
Homework Statement
1.
(a) Prove that the following is a linear transformation:
##\text{T} : \mathbb k[X]_n \rightarrow \mathbb k[X]_{n+1}##
##\text{T}(a_0 + a_1X + \ldots + a_nX^n) = a_0X + \frac{a_1}{2}X^2 + \ldots + \frac{a_n}{n+1}X^{n+1}##
Find ##\text{Ker}(T)## and ##\text{Im}(T)##.
(b) If ##\text{D} : \mathbb R[X]_{n+1} \rightarrow \mathbb R[X]_{n}##
##D(p) = p'##
and ##T : \mathbb R[X]_{n} \rightarrow \mathbb R[X]_{n+1}## is the transformation from part (a), prove that
##D \circ T = \text{id}## but that ##T \circ D \neq \text{id}##
2.
(a) Let ##V## be an ##\mathbb R##-vector space and ##j : V \rightarrow V## a linear transformation such that ##j \circ j = \text{id}_V##. Now, let
##S = \{v \in V : j(v) = v\}## and ##A = \{v \in V : j(v) = -v\}##
Prove that ##S## and ##A## are subspaces and that ##V = S \oplus A##.
(b) Deduce from part (a) that the space of matrices decomposes as the direct sum of the symmetric and the skew-symmetric matrices (by finding a convenient linear transformation ##j##).
[apologies if that last part is a bit weird sounding, I'm translating from Spanish]
Homework Equations
---
The Attempt at a Solution
1a)
##T(a_0 + a_1X + \ldots + a_nX^n + \ldots + b_0 + b_1X + \ldots + b_nX^n) = a_0X + \frac{a_1}{2}X^2 + \ldots + \frac{a_n}{n+1} + \ldots + b_0X + \frac{b_1}{2}X^2 + \ldots + \frac{b_n}{n+1} = T(a_0X + \frac{a_1}{2}X^2 + \ldots + \frac{a_n}{n+1}) + T(b_0X + \frac{b_1}{2}X^2 + \ldots + \frac{b_n}{n+1})##
Let's write ##a## and ##b## for those two polynomials, for short. There are some cut-and-paste errors here, e.g. an ##X^{n+1}## is missing from the last term of ##a## and of ##b##, and you've written ##T(T(a))## and ##T(T(b))## on the RHS. More serious, though, is that you already used additivity in the first step. You have to write ##T(a+b)=T(a_0+\ldots +b_0+\ldots)=T((a_0+b_0)+\ldots)## and only then apply the definition of the operator, since you only have a formula for a single polynomial, not for a sum. On the RHS you then get ##(a_0+b_0)X+\ldots = a_0X+\ldots +b_0X+\ldots = T(a)+T(b)##. This is nit-picking, but the entire exercise is only a question of accuracy.
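For what it's worth, written out in full the additivity check might look like this (just the sketch above, completed):
$$T(a+b) = T\big((a_0+b_0) + (a_1+b_1)X + \ldots + (a_n+b_n)X^n\big) = (a_0+b_0)X + \frac{a_1+b_1}{2}X^2 + \ldots + \frac{a_n+b_n}{n+1}X^{n+1}$$
$$= \left(a_0X + \frac{a_1}{2}X^2 + \ldots + \frac{a_n}{n+1}X^{n+1}\right) + \left(b_0X + \frac{b_1}{2}X^2 + \ldots + \frac{b_n}{n+1}X^{n+1}\right) = T(a) + T(b)\,.$$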
##c \cdot (a_0 + a_1X + \ldots + a_nX^n) = ca_0 + ca_1X + \ldots + ca_nX^n##
##T(c \cdot (a_0 + a_1X + \ldots + a_nX^n) = T(ca_0 + ca_1X + \ldots + ca_nX^n) = (c \cdot a_0X) + (c \cdot \frac{a_1}{2}X^2) + \ldots + (c \cdot \frac{a_n}{n+1}) = c \cdot T(a_0 + a_1X + \ldots + a_nX^n)##
You could add some steps here, too, but it's correct. A bracket is missing in the first term.
I conclude that ##T## is a linear transformation.
Yes. Integration is linear.
However, I'm not really sure how to find ##\text{Ker}(T)## and ##\text{Im}(T)##. For ##\text{Ker}(T)##, for example, would it simply be something along the lines of ##T \mid a_n = 0 \;\forall\, a \in \mathbb k##? Forgive me if this is a foolish question.
It's not a foolish question; it's a matter of practice. ##\operatorname{ker}(T)=\{\,a\, : \,T(a)=a_0X + \frac{a_1}{2}X^2 + \ldots + \frac{a_n}{n+1}X^{n+1}=0\,\}##. Now, when is a polynomial the zero polynomial? Similarly for the image: ##\operatorname{im}(T)=\{\,b=b_0+b_1X+\ldots + b_mX^m\, : \,b=T(a)\,\}##, and then you can write down the conditions on the possible ##b_i##.
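A sketch of where this leads (spelling out the hint, not a full write-up): a polynomial is the zero polynomial exactly when all of its coefficients vanish, so
$$T(a) = 0 \iff a_0 = \frac{a_1}{2} = \ldots = \frac{a_n}{n+1} = 0 \iff a_0 = a_1 = \ldots = a_n = 0\,,$$
which pins down ##\operatorname{ker}(T)##. For the image, compare the coefficients of ##b = T(a)## term by term; the condition on ##b_0## is the interesting one.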
Part (b) has me confused, but mostly because I don't know how to evaluate ##D##. How does ##D(p)## relate to the form of the linear transformation I was given in part (a)?
##D## is differentiation and ##T## is integration. You could just start with an example, say ##a = 2+3X+4X^2##, calculate ##D(a)## and ##T(a)##, and see what happens to the coefficients ##(2,3,4)##.
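Carrying that example through (a quick check using the formulas above):
$$T(a) = 2X + \tfrac{3}{2}X^2 + \tfrac{4}{3}X^3\,, \qquad D(T(a)) = 2 + 3X + 4X^2 = a\,,$$
$$D(a) = 3 + 8X\,, \qquad T(D(a)) = 3X + 4X^2 \neq a\,.$$
One order recovers ##a##, the other loses the constant term, which is exactly the asymmetry part (b) asks you to prove.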
As for question 2)
a) The test for ##S## and ##A## being subspaces is fairly trivial, so I don't include it. Determining that ##V = S \oplus A## is trickier, though. I see clearly that ##S \cap A = \{0\}##, but how do I formalize that and use it to prove that ##V## is the direct sum of ##S## and ##A##?
We have ##j^2=\operatorname{id}_V##, which means ##j(j(v))=v## for all ##v \in V##. Now consider ##u=j(v)+v## and ##w=j(v)-v##.
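To see why these two vectors help (a sketch of the remaining steps):
$$j(u) = j(j(v)) + j(v) = v + j(v) = u\,, \qquad j(w) = j(j(v)) - j(v) = v - j(v) = -w\,,$$
so ##u \in S## and ##w \in A##. Since ##v = \tfrac{1}{2}(u - w) = \tfrac{1}{2}u + \tfrac{1}{2}(-w)##, every ##v## is a sum of an element of ##S## and an element of ##A##, and together with ##S \cap A = \{0\}## this gives ##V = S \oplus A##.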
b) is also confusing me. I found this and it looks remarkably similar to my problem, but I do not know how to apply it here.
Symmetric and skew-symmetric indicate that it must have something to do with a matrix ##A## and its transpose ##A^\tau##. Now use a similar trick as before and define ##A \pm A^\tau## for an arbitrary matrix ##A##.
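Spelled out (a sketch; the convenient transformation is ##j(A) = A^\tau##, which satisfies ##j \circ j = \operatorname{id}##): every square matrix decomposes as
$$A = \underbrace{\tfrac{1}{2}\left(A + A^\tau\right)}_{\text{symmetric}} + \underbrace{\tfrac{1}{2}\left(A - A^\tau\right)}_{\text{skew-symmetric}}\,,$$
so part (a) applied to this ##j## yields exactly the direct sum of the symmetric matrices (##j(A) = A##) and the skew-symmetric ones (##j(A) = -A##).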