
Homework Help: Linear Algebra - Linearly Independent Functions

  1. Nov 29, 2012 #1
    Forgive me ahead of time, I don't really know how to use LaTeX (it's on my to-do list).

    1. The problem statement, all variables and given/known data
    Given the vector space C([0, pi]) of continuous, real-valued functions on that interval, together with the inner product <f, g> = integral from 0 to pi of f(t)*g(t) dt:

    a) Prove the set S = {cos(t), sin(t), 1, t} is linearly independent.

    b) Apply the Gram-Schmidt process to S to get an orthonormal basis of Span(S).

    2. Relevant equations

    3. The attempt at a solution
    I get that (b) is easy, just a plug and chug.

    My question is about a. Of course, any set (and specifically this set for argument's sake) is linearly independent if and only if

    a*cos(t) + b*sin(t) + c*1 + d*t = 0 for all t in [0, pi] only for (a, b, c, d) = (0, 0, 0, 0)

    Or equivalently, if any of the vectors can be expressed as a linear combination of the others.


    So, I need the best way to prove they are linearly independent. Two ideas passed through my mind. The first (which I'm not positive is correct) is that the vector would be the sum of its projections on the other vectors. So

    a = <cos(t), sin(t)>/<sin(t), sin(t)> * sin(t)
    b = <cos(t), 1>/<1, 1> * 1


    And the other idea was to evaluate the functions at different values of t, and prove that a, b, and c could not possibly be constants, and therefore the functions are not linear combinations of each other.

    Are either of these the right way to go about this?

    Thanks for your help ahead of time, and sorry for my lack of LaTeX knowledge.
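    Not part of the original post, but the second idea can be sanity-checked symbolically, e.g. with sympy (a sketch, assuming sympy is available). A dependence relation a*cos(t) + b*sin(t) + c + d*t = 0 would have to hold at every t, so in particular at any four sample points; if the resulting linear system forces a = b = c = d = 0, the set is independent:

    ```python
    import sympy as sp

    a, b, c, d, t = sp.symbols('a b c d t')
    combo = a*sp.cos(t) + b*sp.sin(t) + c + d*t

    # A dependence relation must vanish at EVERY t,
    # so in particular at these four sample points.
    points = [0, sp.pi/2, sp.pi, 2*sp.pi]
    system = [combo.subs(t, p) for p in points]

    # Solve the resulting 4x4 linear system for (a, b, c, d).
    sol = sp.solve(system, [a, b, c, d])
    print(sol)  # only the trivial solution: {a: 0, b: 0, c: 0, d: 0}
    ```

    The choice of sample points is mine; any four points that make the system nonsingular would do.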
  3. Nov 29, 2012 #2


    Your second method is good.
  4. Nov 29, 2012 #3
    So then the first one is bad?

    Also, would I need to do it for all vectors? Or would it suffice to show it for just one?

    Last edited: Nov 29, 2012
  5. Nov 29, 2012 #4



    The first method is problematic because the vectors aren't orthogonal. You can't assume that a vector would be the sum of its projections onto the basis vectors.
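    A quick numerical illustration of that point (my own example with made-up vectors, using numpy): for a non-orthogonal pair in R^2 the sum of projections overcounts the shared direction, while for an orthonormal basis it recovers the vector exactly.

    ```python
    import numpy as np

    def proj(v, u):
        # orthogonal projection of v onto the line spanned by u
        return (v @ u) / (u @ u) * u

    v = np.array([2.0, 1.0])

    # Non-orthogonal pair: the projections "overcount" the shared direction.
    u1, u2 = np.array([1.0, 0.0]), np.array([1.0, 1.0])
    print(proj(v, u1) + proj(v, u2))   # [3.5 1.5], not v

    # Orthonormal basis: the sum of projections recovers v exactly.
    e1, e2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
    print(proj(v, e1) + proj(v, e2))   # [2. 1.]
    ```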
  6. Nov 29, 2012 #5
    Ah, that makes sense. I thought it was right because I was doing examples in my head in R3, so for example (1,1,1) is the sum of its projections onto the standard basis. But then I realized that doesn't hold in general.

  7. Nov 30, 2012 #6



    The second part is NOT equivalent to the first. If any vector can be expressed as a linear combination of the others, the set is linearly dependent, not linearly independent.
    If the equation a*cos(t) + b*sin(t) + c + d*t = 0 held for all values of t with constants a, b, c, and d not all zero, then the set {cos(t), sin(t), 1, t} would be linearly dependent.
  8. Nov 30, 2012 #7
    Gah, the first part was just a slip of the tongue. It was supposed to say cannot be expressed.
  9. Nov 30, 2012 #8



    No, not "equivalently". The first is, as you say, "linearly independent", but the second characterizes "linearly dependent"!

    That would work but is an awful lot of unnecessary computation.

    Yes, that would work. And since you have only three "unknowns", a, b, and c, you need only three equations and so only three different values of t.
    a cos(t) + b sin(t) + c = 0
    Taking t = 0, that becomes a + c = 0.
    Taking [itex]t= \pi/2[/itex], that becomes b + c = 0.
    Taking [itex]t= \pi[/itex], that becomes -a + c = 0.
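    Since part (b) came up as "plug and chug", here is one way the Gram-Schmidt step could be sketched with sympy's symbolic integration (my own sketch, not from the thread), using the thread's inner product <f, g> = integral of f*g from 0 to pi:

    ```python
    import sympy as sp

    t = sp.symbols('t')

    def ip(f, g):
        # the thread's inner product: integral of f*g over [0, pi]
        return sp.integrate(f * g, (t, 0, sp.pi))

    S = [sp.cos(t), sp.sin(t), sp.Integer(1), t]

    basis = []
    for f in S:
        # subtract the projections onto the vectors already orthonormalized
        for e in basis:
            f = f - ip(f, e) * e
        # normalize to unit length in this inner product
        basis.append(sp.simplify(f / sp.sqrt(ip(f, f))))

    for e in basis:
        print(e)
    ```

    The first two functions come out as cos(t) and sin(t) scaled by sqrt(2/pi), since they are already orthogonal to each other under this inner product.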
