Linear Algebra (Vector spaces, linearly independent subsets, transformations)

In summary: I'm still confused. The goal is to show a contradiction, namely that the linear combination of the T's is not zero. So you're trying to find a polynomial that gives zero when the combination of T's is applied to it. But if p(x) is in P(R), then by assumption p(x) is also in L(V). So when you evaluate the polynomial at x = 0, you should get something concrete, in this case 1. But that doesn't seem to contradict the assumption that the T's are linearly dependent.
  • #1
jeff1evesque
Assignment question:
Let [tex]V = P (R)[/tex] and for j >= 1 define [tex]T_j(f(x)) = f^j (x)[/tex]
where [tex]f^j(x)[/tex] is the jth derivative of f(x). Prove that the
set {[tex]T_1, T_2,..., T_n [/tex]} is a linearly independent subset of [tex] L(V)[/tex]
for any positive integer n.

I have no idea how V= P(R) has anything to do with the rest of the problem- in particular with the transformation [tex]T_j(f(x)) = f^j (x)[/tex]. I guess I just don't understand how V ties in with the definition of the transformation T.

2. Relevant ideas:
I heard two different professors each suggest a different method of proving this. One said it was possible to prove it by contradiction, and another tried to help me prove it directly.

So when I started the method of contradiction I had:
Let B = {[tex] a_1T_1, a_2T_2, ..., a_nT_n [/tex]}
Assume B is linearly dependent (goal: derive a contradiction from assuming it's dependent?)
Choose [tex]T_i[/tex] from B such that 1<= i <= n.
Therefore, [tex]T_i = B - a_iT_i[/tex]

From here I couldn't continue. Do I have to show that the linear combination above (to the right of the equality) takes elements to the same ith derivative, and derive a contradiction there?
----------------

The second method involved finding a basis of V. A professor said it was {[tex]1, x/1!, x^2/2!, ..., x^n/n![/tex]}. But that doesn't make sense to me. I thought the basis would be {[tex]0, x, x^2, ..., x^n[/tex]}.
 
  • #2
I'm going to assume that P(R) is the vector space of polynomials with real coefficients under regular addition and scalar multiplication. If so, what do you mean by L(V)?
 
  • #3
If the T's are linearly dependent then there is a linear combination a1*T1+a2*T2+...+an*Tn that gives you zero when applied to any polynomial p(x), right? Ok, then apply that to specific polynomials and evaluate the result at x=0. Say p(x)=x. T1(x)=1, T2(x)=0, ... What does that tell you about a1? Suppose p(x)=x^2. T1(x^2)=2x, T2(x^2)=2, T3(x^2)=0. ... Now set x=0. What does that tell you about a2? Etc. Etc.
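If it helps to see this evaluation trick numerically, here is a small Python sketch (not the proof itself, and all names are made up for illustration): a polynomial is stored as a list of coefficients, the combination a_1*T_1 + ... + a_n*T_n is applied, and the result is evaluated at x = 0. Applying it to x^k and evaluating at 0 isolates a_k * k!.

```python
def derivative(coeffs):
    # coeffs[i] is the coefficient of x^i; differentiate once
    return [i * c for i, c in enumerate(coeffs)][1:] or [0]

def nth_derivative(coeffs, j):
    for _ in range(j):
        coeffs = derivative(coeffs)
    return coeffs

def eval_at_zero(coeffs):
    # a polynomial evaluated at 0 is just its constant term
    return coeffs[0]

def combo_at_zero(a, coeffs):
    # (a[0]*T_1 + a[1]*T_2 + ... + a[n-1]*T_n)(p), evaluated at x = 0
    return sum(a[j] * eval_at_zero(nth_derivative(list(coeffs), j + 1))
               for j in range(len(a)))

# For p(x) = x^k, T_k(p)(0) = k! and T_j(p)(0) = 0 for j != k,
# so the combination applied to x^k and evaluated at 0 is a_k * k!.
a = [3.0, -1.0, 2.0]          # hypothetical coefficients a_1, a_2, a_3
x2 = [0, 0, 1]                # p(x) = x^2
print(combo_at_zero(a, x2))   # a_2 * 2! = -2.0
```

If the combination is identically zero, this number must be 0 for every k, which forces each a_k to be 0 in turn.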
 
  • #4
foxjwill said:
I'm going to assume that P(R) is the vector space of polynomials with real coefficients under regular addition and scalar multiplication. If so, what do you mean by L(V)?

V=P(R). I think L(V) is linear transformations from V->V.
 
  • #5
foxjwill said:
I'm going to assume that P(R) is the vector space of polynomials with real coefficients under regular addition and scalar multiplication. If so, what do you mean by L(V)?

My notes from class say "it is the set of linear operators on V".
 
  • #6
Dick said:
If the T's are linearly dependent then there is a linear combination a1*T1+a2*T2+...+an*Tn that gives you zero when applied to any polynomial p(x), right? Ok, then apply that to specific polynomials and evaluate the result at x=0. Say p(x)=x. T1(x)=1, T2(x)=0, ... What does that tell you about a1? Suppose p(x)=x^2. T1(x^2)=2x, T2(x^2)=2, T3(x^2)=0. ... Now set x=0. What does that tell you about a2? Etc. Etc.

Dick, you are saying the exact same thing one of my professors was saying to me. However, I don't understand what you mean by "gives you zero when applied to any polynomial p(x)"...


Thanks,


JL
 
  • #7
They are operators in L(V). If they are linearly dependent then there's a nontrivial linear combination of those operators that is IDENTICALLY zero. I.e. that combination of operators applied to anything in the space is zero. That's what linearly dependent means. If you can show any nontrivial combination must be nonzero someplace, then you've shown they are linearly independent.
 
  • #8
Dick said:
They are operators in L(V). If they are linearly dependent then there's a nontrivial linear combination of those operators that is IDENTICALLY zero. I.e. that combination of operators applied to anything in the space is zero. That's what linearly dependent means. If you can show any nontrivial combination must be nonzero someplace, then you've shown they are linearly independent.

Oh, so p(x) is taken from P(R), where V = P(R)? But how do you know p(x) = x; how do you know the domain equals the image?

Thanks for the help- it takes me awhile to learn these things,


JL
 
  • #9
S'ok. I don't know p(x)=x. p(x) could be anything. I'm just picking examples for p(x) that I can use to show the various a's must be zero.
 
  • #10
Dick said:
S'ok. I don't know p(x)=x. p(x) could be anything. I'm just picking examples for p(x) that I can use to show the various a's must be zero.

Oh, OK, I kind of see. But why are we setting x = 0 again? And I found [tex]a_1 = 1, a_2 = 2, a_3 = 6, a_4 = 24, ..., a_n = n![/tex] So I am guessing the set of all these values constitutes the independent set? Would this reasoning, along with what you provided, give me enough to write a proof?

JL
 
  • #11
How did you get the a values?
I could be very wrong, but from Dick's comments I thought you were trying to show all the a's must be identically zero... this would give you a contradiction with your initial assumption that the set of operators is linearly dependent... and actually show they are linearly independent.
 
  • #12
Just a comment on something you said in the first post that I don't believe anyone has caught:
The second method involve find the bases of V. A professor said it was {[itex]1, x/1!, x^2/2!, ..., x^n/n![/itex]}. But that doesn't make sense to me. I thought the basis would be {[itex]0, x, x^2,... x^n[/itex]}.
You can never have 0 in a basis. 1, yes, but 0, no. Other than that, each vector (i.e., function) in your basis is a constant multiple of the corresponding member of the other basis.
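As a side note on why that scaled basis is convenient here: differentiation sends x^k/k! to x^(k-1)/(k-1)!, i.e. it just shifts the basis vectors down by one. A quick check with exact fractions (the helper names are ad hoc, not standard):

```python
from fractions import Fraction
from math import factorial

def derivative(coeffs):
    # coeffs[i] is the coefficient of x^i; differentiate once
    d = [i * c for i, c in enumerate(coeffs)][1:]
    return d or [Fraction(0)]

def e(k, size):
    # basis vector e_k(x) = x^k / k! as a coefficient list of length `size`
    return [Fraction(1, factorial(k)) if i == k else Fraction(0)
            for i in range(size)]

# differentiating e_k once gives e_{k-1}: the derivative shifts the basis
print(derivative(e(3, 5)) == e(2, 4))   # True
```

With the unscaled basis {1, x, x^2, ...} the derivative introduces extra integer factors (d/dx of x^k is k*x^(k-1)), which the 1/k! scaling absorbs.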
 
  • #13
Thanks everyone, I solved this problem.


JL
 

1. What is a vector space?

A vector space is a mathematical structure that consists of a set of vectors and a set of operations defined on those vectors, such as addition and scalar multiplication. These operations must follow certain properties, such as closure and associativity, in order for the set to be considered a vector space.
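As a concrete instance of those operations, P(R) itself can be modeled with coefficient lists; a minimal sketch (illustrative names, not a library API) showing closure and commutativity of addition:

```python
def add(p, q):
    # coefficientwise addition of polynomials stored as coefficient lists
    n = max(len(p), len(q))
    p = p + [0] * (n - len(p))
    q = q + [0] * (n - len(q))
    return [a + b for a, b in zip(p, q)]

def scale(c, p):
    # scalar multiplication: multiply every coefficient by c
    return [c * a for a in p]

p, q = [1, 2], [0, 0, 3]         # p = 1 + 2x, q = 3x^2
print(add(p, q))                 # [1, 2, 3]: the sum is again a polynomial
print(add(p, q) == add(q, p))    # True: addition is commutative
```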

2. How do you determine if a set of vectors is linearly independent?

A set of vectors is linearly independent if none of the vectors in the set can be expressed as a linear combination of the others. In other words, no vector in the set can be written as a combination of the other vectors multiplied by scalars. This can be checked by setting a linear combination of the vectors equal to zero and solving the resulting system of equations: the set is independent exactly when the only solution is all scalars equal to zero.
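For finite-dimensional coordinate vectors, solving that system amounts to a rank computation: the vectors are independent iff the rank equals the number of vectors. A stdlib-only sketch via Gaussian elimination (in practice one would use a library routine such as `numpy.linalg.matrix_rank`):

```python
def rank(rows):
    # Gaussian elimination over floats; the rank is the number of pivots
    rows = [list(map(float, r)) for r in rows]
    piv = 0
    for col in range(len(rows[0])):
        # find a row at or below `piv` with a nonzero entry in this column
        hit = next((r for r in range(piv, len(rows))
                    if abs(rows[r][col]) > 1e-9), None)
        if hit is None:
            continue
        rows[piv], rows[hit] = rows[hit], rows[piv]
        for r in range(len(rows)):
            if r != piv:
                f = rows[r][col] / rows[piv][col]
                rows[r] = [a - f * b for a, b in zip(rows[r], rows[piv])]
        piv += 1
    return piv

# rank 2 < 3 vectors, so the set is dependent: the third row is row1 + row2
print(rank([[1, 0, 0], [0, 1, 0], [1, 1, 0]]))  # 2
```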

3. What is a transformation in linear algebra?

A transformation in linear algebra is a function that maps vectors from one vector space to another. This can be thought of as a way to "transform" the vectors from one space to another while preserving certain properties, such as linearity. Examples of transformations include rotations, reflections, and dilations.

4. How do you find the inverse of a linear transformation?

To find the inverse of a linear transformation, you can use the concept of an inverse function. This involves finding the function that "undoes" the original transformation. In linear algebra, the inverse of a linear transformation is also a linear transformation and can be found using techniques such as the inverse matrix method or the Gauss-Jordan elimination method.
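In the simplest matrix case this is very concrete; a sketch for a 2x2 transformation using the adjugate formula (helper names are made up for the example):

```python
def inv2(m):
    # inverse of a 2x2 matrix [[a, b], [c, d]]: (1/det) * [[d, -b], [-c, a]]
    (a, b), (c, d) = m
    det = a * d - b * c
    if det == 0:
        raise ValueError("transformation is not invertible")
    return [[d / det, -b / det], [-c / det, a / det]]

def matvec(m, v):
    # apply the linear transformation m to the vector v
    return [sum(mij * vj for mij, vj in zip(row, v)) for row in m]

T = [[2, 1], [0, 1]]            # T(x, y) = (2x + y, y): scale and shear
Tinv = inv2(T)
print(matvec(Tinv, matvec(T, [3, 4])))  # [3.0, 4.0]: the inverse undoes T
```

A zero determinant signals exactly the non-invertible case, where no such "undoing" transformation exists.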

5. What is the importance of linear algebra in science?

Linear algebra is a foundational topic in mathematics that has many applications in science. It is used in fields such as physics, engineering, computer science, and economics to model and solve real-world problems. It also provides a framework for understanding and analyzing systems that involve multiple variables and relationships.
