Linear (in)dependency

  Oct 20, 2016 #1

    Math_QED

    Homework Helper

    Hello all.

    I have a question about linear dependency.

    Suppose we have a set ##S## of functions defined on ##\mathbb{R}##.

    ##S = \{e^x, x^2\}##. It seems very intuitive that this set is linearly independent, but we did something in class that I'm unsure about.

    Proof:

    Let ##\alpha, \beta \in \mathbb{R}##.
    Suppose ##\alpha e^x + \beta x^2 = 0##
    We need to show that ##\alpha = \beta = 0##

    (Here comes the part I'm unsure about)

    Let ##x = 0##, then ##\alpha e^0 + \beta 0^2 = 0##
    ##\Rightarrow \alpha = 0##

    But if ##\alpha = 0##, it then follows that ##\beta = 0## (take, e.g., ##x = 1## in ##\beta x^2 = 0##).
    So ##S## is linearly independent.

    My actual question:

    Why can we conclude that the set is linearly independent just by showing that ##x = 0## makes it work? Shouldn't we show that it works for all ##x \in \mathbb{R}##?

    Thanks in advance.
     
  Oct 20, 2016 #2

    fresh_42

    Staff: Mentor

    We can't conclude it from ##x = 0## alone; the conclusion is derived from ##\alpha = 0##, not from ##x = 0##.
    Yes, and this is the crucial point: the equation ##\alpha e^x + \beta x^2 = 0## has to hold for all ##x##, so in particular for ##x = 0##.
    And if ##x = 0## already implies ##\alpha = \beta = 0##, what chance is there for other values of ##x##? The coefficients do not depend on ##x##!
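
    If you want to see the same argument carried out mechanically, here is a minimal sketch (assuming Python with sympy is available) that imposes the identity at ##x = 0## and at one further point and solves for the coefficients:

        # Sketch: impose alpha*e^x + beta*x^2 = 0 at x = 0 and x = 1,
        # then solve the resulting linear system for alpha and beta.
        from sympy import symbols, exp, solve

        x, alpha, beta = symbols('x alpha beta')
        expr = alpha*exp(x) + beta*x**2

        # The identity must hold for every x, so in particular at these two points.
        eqs = [expr.subs(x, 0), expr.subs(x, 1)]
        print(solve(eqs, [alpha, beta]))  # {alpha: 0, beta: 0} -- only the trivial solution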
     
  Oct 20, 2016 #3

    Math_QED

    Homework Helper

    So we can conclude this because the coefficients do not depend on ##x##? From what I understood, it must hold for all ##x##, so certainly for ##x = 0##? I don't think I fully understand yet.

    To complicate things even further, suppose we consider these functions on the domain ##\mathbb{R}_0##; how do we show linear (in)dependence then?
     
  Oct 20, 2016 #4

    fresh_42

    Staff: Mentor

    Yes (the coefficients do not depend on ##x##).
    Yes, it must hold for all ##x##, so certainly for ##x = 0##.
    True for all ##x## implies true for a particular ##x## as well, and everything derived from that single instance has to be true. Holding at a single point might not be sufficient for the identity to hold at all ##x##, but it is necessary. And if something fails for one value of ##x##, it cannot hold for all of them.
    What do you mean by ##\mathbb{R}_0##? ##\mathbb{R} \setminus \{0\}##?
    If ##0## is in the domain, then the method above can be used.
    If ##0## is not in the domain, we have to do a bit more work, e.g. by imposing ##\alpha e^x + \beta x^2 = 0## at the values ##x \in \{1, 2, -1, -2\}## and solving the resulting system. (I haven't done it; I simply listed enough values to be sure the system can only hold for ##\alpha = \beta = 0##.)
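
    For instance, here is a small sketch (assuming Python with sympy) that imposes the identity only at the nonzero points ##x = 1## and ##x = -1##; these two equations already force the trivial solution:

        # Sketch: impose alpha*e^x + beta*x^2 = 0 at the nonzero points x = 1 and x = -1.
        from sympy import symbols, exp, solve

        x, alpha, beta = symbols('x alpha beta')
        expr = alpha*exp(x) + beta*x**2

        eqs = [expr.subs(x, 1), expr.subs(x, -1)]  # alpha*e + beta = 0, alpha/e + beta = 0
        print(solve(eqs, [alpha, beta]))           # {alpha: 0, beta: 0}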

    The domain from which the coefficients ##\alpha \, , \, \beta## are taken is essential.
    Until now we have discussed linear independence over ##\mathbb{Q}\, , \,\mathbb{R}## or ##\mathbb{C}##.
    However, the two functions are not linearly independent if we allow the coefficients to be functions themselves.
    We could get ##\alpha(x) e^x + \beta (x) x^2 = 0## with ##\alpha(x) = -x^2 \neq 0## and ##\beta(x) = e^x \neq 0##.

    Let me cheat here a little bit, because I don't want to think about which coefficient domain containing such functions would also be a field. (The exponential function complicates things here.) So let us instead consider ##\mathbb{Q}(x)##, the rational functions in one variable (quotients of polynomials), which is a field.
    Let us further take ##S=\{x,x^2\}##.
    Then ##\alpha x + \beta x^2 = 0 \Longrightarrow \alpha = \beta = 0## if ##\alpha \, , \, \beta \in \mathbb{Q}##.
    But ##\alpha x + \beta x^2 = 0 \nRightarrow \alpha = \beta = 0## if ##\alpha \, , \, \beta \in \mathbb{Q}(x)##.
    In this case we have an equation ## \alpha x + \beta x^2 = 0## where we can choose ##\alpha = -x \neq 0## and ##\beta = 1 \neq 0##.
    So the elements of ##S## are linearly independent over ##\mathbb{Q}##, but linearly dependent over ##\mathbb{Q}(x)##.
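
    As a quick check of that last claim, a sketch (assuming Python with sympy): with the coefficients ##\alpha = -x## and ##\beta = 1## from ##\mathbb{Q}(x)##, the combination really is identically zero:

        # Sketch: over Q(x) the nonzero coefficients alpha = -x, beta = 1
        # give a combination of x and x**2 that vanishes identically.
        from sympy import symbols, simplify

        x = symbols('x')
        alpha, beta = -x, 1
        print(simplify(alpha*x + beta*x**2))  # 0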
     
    Last edited: Oct 21, 2016
  Oct 20, 2016 #5

    Mark44

    Staff: Mentor

    No, that's an incomplete summary of what you need to show. Suppose that your set is ##\{x, 2x\}##.
    Suppose ##\alpha x + \beta 2x = 0##
    Then ##\alpha = 0## and ##\beta = 0## clearly work.

    From this one might mistakenly conclude that the functions ##x## and ##2x## are linearly independent, which is not true.
    What you left out from "We need to show that ##\alpha = \beta = 0##" is that there can be no other solutions for these constants.
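
    As a quick illustration, a sketch (assuming Python with sympy): for ##\{x, 2x\}## the pair ##\alpha = 2, \beta = -1## also works, so the trivial solution is not the only one and the set is linearly dependent:

        # Sketch: {x, 2x} admits a nontrivial solution, e.g. alpha = 2, beta = -1.
        from sympy import symbols, simplify, solve

        x, alpha, beta = symbols('x alpha beta')
        expr = alpha*x + beta*(2*x)

        print(simplify(expr.subs({alpha: 2, beta: -1})))  # 0 for every x
        print(solve(expr.subs(x, 1), alpha))              # [-2*beta]: alpha need not be 0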
     