Linear Algebra (Vector spaces, linearly independent subsets, transformations)


Homework Help Overview

The discussion revolves around a linear algebra problem involving the vector space of polynomials, specifically proving the linear independence of a set of transformations defined by taking derivatives of polynomials. The original poster expresses confusion about the relationship between the vector space \( V = P(R) \) and the transformations \( T_j(f(x)) = f^{(j)}(x) \).

Discussion Character

  • Exploratory, Conceptual clarification, Assumption checking

Approaches and Questions Raised

  • Participants explore different methods to prove linear independence, including contradiction and direct proof. Questions arise about the definitions of the vector space and the transformations, as well as the appropriate basis for \( V \). Some participants suggest evaluating specific polynomials to analyze the implications of linear dependence.

Discussion Status

The discussion is active with various interpretations being explored. Some participants provide guidance on how to approach the proof, while others question the assumptions and definitions involved. There is no explicit consensus on the best method or understanding of the concepts, indicating a productive exploration of the topic.

Contextual Notes

Participants note confusion regarding the basis of the vector space and the implications of linear dependence versus independence. There is also mention of differing methods suggested by professors, which adds to the complexity of the discussion.

jeff1evesque
Assignment question:
Let [tex]V = P(R)[/tex] and for [tex]j \geq 1[/tex] define [tex]T_j(f(x)) = f^{(j)}(x)[/tex],
where [tex]f^{(j)}(x)[/tex] is the jth derivative of [tex]f(x)[/tex]. Prove that the
set {[tex]T_1, T_2, \dots, T_n[/tex]} is a linearly independent subset of [tex]L(V)[/tex]
for any positive integer n.

I have no idea how V = P(R) has anything to do with the rest of the problem, in particular with the transformation [tex]T_j(f(x)) = f^{(j)}(x)[/tex]. I guess I just don't understand how V ties in with the definition of the transformation T.

2. Relevant ideas:
I heard two different professors each suggest a different method of proving this problem: one said it was possible to prove it by contradiction, and another tried to help me prove it directly.

So when I started the method of contradiction I had:
Let B = {[tex]a_1T_1, a_2T_2, \dots, a_nT_n[/tex]}
Assume B is linearly dependent (goal: derive a contradiction from assuming it's dependent?)
Choose [tex]T_i[/tex] from B such that [tex]1 \leq i \leq n[/tex].
Therefore, [tex]T_i = B - a_iT_i[/tex]

From here I couldn't continue. I have to show that the linear combination above (to the right of the equality) takes elements to the same ith derivative, and derive a contradiction from that?
----------------

The second method involves finding a basis of V. A professor said it was {[tex]1, x/1!, x^2/2!, \dots, x^n/n![/tex]}. But that doesn't make sense to me. I thought the basis would be {[tex]0, x, x^2, \dots, x^n[/tex]}.
 
I'm going to assume that P(R) is the vector space of polynomials with real coefficients under regular addition and scalar multiplication. If so, what do you mean by L(V)?
 
If the T's are linearly dependent then there is a linear combination a1*T1+a2*T2+...+an*Tn that gives you zero when applied to any polynomial p(x), right? Ok, then apply that to specific polynomials and evaluate the result at x=0. Say p(x)=x. T1(x)=1, T2(x)=0, ... What does that tell you about a1? Suppose p(x)=x^2. T1(x^2)=2x, T2(x^2)=2, T3(x^2)=0. ... Now set x=0. What does that tell you about a2? Etc. Etc.
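For concreteness, here is a worked version of that check (the test polynomials p(x) = x and p(x) = x^2 are just convenient choices, not the only ones that work). Applying the combination to p(x) = x:

[tex](a_1T_1 + a_2T_2 + \cdots + a_nT_n)(x) = a_1 \cdot 1 + a_2 \cdot 0 + \cdots + a_n \cdot 0 = a_1[/tex]

So if the combination is identically zero, then [tex]a_1 = 0[/tex]. Applying it to p(x) = x^2:

[tex](a_1T_1 + a_2T_2 + \cdots + a_nT_n)(x^2) = 2a_1x + 2a_2[/tex]

which at x = 0 equals [tex]2a_2[/tex], forcing [tex]a_2 = 0[/tex] as well.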
 
foxjwill said:
I'm going to assume that P(R) is the vector space of polynomials with real coefficients under regular addition and scalar multiplication. If so, what do you mean by L(V)?

V = P(R). I think L(V) is the set of linear transformations from V to V; my notes from class say it is "the set of linear operators on V".
 
Dick said:
If the T's are linearly dependent then there is a linear combination a1*T1+a2*T2+...+an*Tn that gives you zero when applied to any polynomial p(x), right? Ok, then apply that to specific polynomials and evaluate the result at x=0. Say p(x)=x. T1(x)=1, T2(x)=0, ... What does that tell you about a1? Suppose p(x)=x^2. T1(x^2)=2x, T2(x^2)=2, T3(x^2)=0. ... Now set x=0. What does that tell you about a2? Etc. Etc.

Dick, you are saying the exact same thing one of my professors was saying to me. However, I don't understand what you mean by "gives you zero when applied to any polynomial p(x)".

Thanks,

JL
 
They are operators in L(V). If they are linearly dependent then there's a linear combination of those operators that is IDENTICALLY zero. I.e. the combination of those operators applied to anything in the space is zero. That's what linearly dependent means. If you can show any combination must be nonzero someplace, then you've shown they are linearly independent.
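Written out, since this definition is the crux: saying {[tex]T_1, T_2, \dots, T_n[/tex]} is linearly dependent in L(V) means there are scalars [tex]a_1, \dots, a_n[/tex], not all zero, with

[tex](a_1T_1 + a_2T_2 + \cdots + a_nT_n)(f) = 0 \quad \text{for every } f \in V[/tex]

So to prove independence it is enough to show: if the combination sends every polynomial to zero, then every [tex]a_j[/tex] must be zero.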
 
Dick said:
They are operators in L(V). If they are linearly dependent then there's a linear combination of those operators that is IDENTICALLY zero. I.e. the combination of those operators applied to anything in the space is zero. That's what linearly dependent means. If you can show any combination must be nonzero someplace, then you've shown they are linearly independent.

Oh, so p(x) comes from P(R), where V = P(R)? But how do you know p(x) = x? How do you know the domain equals the image?

Thanks for the help, it takes me a while to learn these things,

JL
 
S'ok. I don't know p(x)=x. p(x) could be anything. I'm just picking examples for p(x) that I can use to show the various a's must be zero.
 
Dick said:
S'ok. I don't know p(x)=x. p(x) could be anything. I'm just picking examples for p(x) that I can use to show the various a's must be zero.

Oh, OK, I kind of see. But why are we setting x = 0 again? Also, I found [tex]a_1 = 1, a_2 = 2, a_3 = 6, a_4 = 24, \dots, a_n = n![/tex]. So I am guessing the set of all these values constitutes the independent set? Would this reasoning, along with what you provided, give me enough to write a proof?

JL
 
How did you get the a values?
I could be very wrong, but from Dick's comments I thought you were trying to show all the a's must be identically zero... this would give you a contradiction with your initial assumption that the set of operators is linearly dependent... and actually show they are linearly independent.
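To make that concrete, here is the general pattern behind Dick's examples (a sketch of one way to finish; the test polynomials [tex]x^k[/tex] are a choice of this sketch, not forced by the problem). Applying the combination to [tex]p(x) = x^k[/tex] gives

[tex]\left(\sum_{j=1}^{n} a_jT_j\right)(x^k) = \sum_{j=1}^{k} a_j \frac{k!}{(k-j)!}\, x^{k-j}[/tex]

since the jth derivative of [tex]x^k[/tex] vanishes for j > k. Evaluating at x = 0 kills every term except j = k, leaving [tex]a_k \cdot k! = 0[/tex], so [tex]a_k = 0[/tex]. Running k from 1 up to n forces all the coefficients to zero, contradicting the dependence assumption.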
 
Just a comment on something you said in the first post that I don't believe anyone has caught:
The second method involves finding a basis of V. A professor said it was {[itex]1, x/1!, x^2/2!, \dots, x^n/n![/itex]}. But that doesn't make sense to me. I thought the basis would be {[itex]0, x, x^2, \dots, x^n[/itex]}.
You can never have 0 in a basis. 1, yes, but 0, no. Other than that, each vector (i.e., function) in your basis is a constant multiple of the corresponding member of the other basis.
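As an aside on why that professor's basis is a natural choice here (an observation, not something the proof requires): writing [tex]e_k = x^k/k![/tex], each [tex]T_j[/tex] just shifts the basis down,

[tex]T_j(e_k) = \frac{x^{k-j}}{(k-j)!} = e_{k-j} \quad \text{for } k \geq j, \qquad T_j(e_k) = 0 \quad \text{for } k < j[/tex]

so the action of any combination [tex]\sum_j a_jT_j[/tex] on these basis vectors is very easy to read off.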
 
Thanks everyone, I solved this problem.


JL
 
