Homework Help: Linear Independence of Two Functions

  1. Jun 24, 2017 #1

    Drakkith


    1. The problem statement, all variables and given/known data
    Use definition (1) to determine if the functions ##y_1## and ##y_2## are linearly dependent on the interval (0,1).
    ##y_1(t)=\cos(t)\sin(t)##
    ##y_2(t)=\sin(t)##

    2. Relevant equations
    (1) A pair of functions is said to be linearly independent on the interval ##I## if and only if neither of them is a constant multiple of the other on all of ##I##.

    3. The attempt at a solution
    My first thought was to put an ##x## in front of one of the functions, set them equal to each other, and solve for ##x##. That left me with ##x=\cos(t)## and ##x=\sec(t)##.
    My thought was that since these expressions change value as ##t## changes, the two original functions are linearly independent. But that appears to be wrong.

    Apparently my understanding of what definition (1) means is incorrect. Could someone enlighten me?
     
  3. Jun 24, 2017 #2

    fresh_42


    I read (and know) the definition as: ##\{y_1,y_2\}## is linearly dependent if and only if there is a ##c \in \mathbb{R}## such that ##y_1(t) = c \cdot y_2(t)## for all ##t \in I##, or the other way around (in case ##y_2 \equiv 0##). This means linear (in)dependence over ##\mathbb{R}##.

    Neither ##\cos(t)## nor ##\sin(t)## has zeroes in ##I##, so ##y_1(t)=\cos(t) \cdot y_2(t)## with ##\cos(t) \not\equiv c##, and they are linearly independent. Am I missing something? Why do you think it's wrong? Or is it about linearity over a field of functions?
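
    To spell that out, a minimal worked version of the argument (my numbers, still assuming the original ##y_2(t)=\sin(t)##): if ##y_1 = c \cdot y_2## on ##I=(0,1)##, then ##\cos(t)\sin(t) = c\sin(t)## for all ##t##, and since ##\sin(t) \neq 0## on ##(0,1)## we may divide by ##\sin(t)## to get ##c = \cos(t)## for every ##t##. No single constant does that, e.g. ##\cos(0.5) \approx 0.878## while ##\cos(0.9) \approx 0.622##, so the pair is linearly independent over ##\mathbb{R}##.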
     
  4. Jun 24, 2017 #3

    Drakkith


    I'm sorry, Fresh. Apparently ##y_2=\sin(2t)##, not ##\sin(t)##...
    Solving for ##x## now yields ##x=1/2##, which means they are linearly dependent, right?
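
    For the record, the one-line check (using the standard double-angle identity): ##\sin(2t) = 2\sin(t)\cos(t)##, so ##y_1(t) = \cos(t)\sin(t) = \tfrac{1}{2}\sin(2t) = \tfrac{1}{2}\,y_2(t)## for every ##t##, i.e. one function is a constant multiple of the other on all of ##(0,1)##.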
     
  5. Jun 24, 2017 #4

    vela


    Ayuh
     
  6. Jun 24, 2017 #5

    fresh_42


    ... if I knew the formula by heart ... wait ... yes. Or ##2## if you take the other direction.
    And in case we consider the field of all continuous and bounded functions on ##I## which don't have zeroes in ##I##, then it would even be linearly dependent in the first version.
     
  7. Jun 24, 2017 #6

    Drakkith


    Gesundheit.
     
  8. Jun 24, 2017 #7

    Drakkith


    We're not looking for integer values for ##x## here, are we?
     
  9. Jun 24, 2017 #8

    fresh_42


    No, just real numbers. The remark was meant to stress that one always has to say over what linear dependence is taken. Here, over the reals.

    If we consider continuous and bounded functions on ##I## which do not have a zero, then these functions form a field.
    We then define ##(y_1 \cdot y_2) \, : \,t \longmapsto y_1(t)\cdot y_2(t)## as the multiplication, and this gives us a field ##F##, which can serve as the scalar field of a vector space, e.g. a space of certain other functions.
    Now in this field, ##\lambda \, : \, t \longmapsto \cos(t)## is an element of ##F##, which makes it a scalar, and ##y_1(t)= \cos(t)\cdot \sin(t) = \lambda(t) \cdot y_2(t)## is a linear dependence. So if we change the scalar field of a vector space, linear independence can turn into linear dependence.

    An easier example: ##\{1,i\}## is linearly independent over ##\mathbb{R}## but linearly dependent over ##\mathbb{C}##.
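
    Spelled out (a short worked check): over ##\mathbb{R}##, ##a \cdot 1 + b \cdot i = 0## with ##a, b \in \mathbb{R}## forces ##a = b = 0##, so the set is independent. Over ##\mathbb{C}##, ##i \cdot 1 + (-1) \cdot i = 0## is a non-trivial combination (coefficients ##i## and ##-1##), so the set is dependent; equivalently, ##i = i \cdot 1## is a complex multiple of ##1##.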
     
  10. Jun 24, 2017 #9

    Mark44


    ##y_1(t) = \cos(t)\sin(t)## and ##y_2(t) = \sin(2t)## are linearly dependent. Each function is some constant multiple of the other. I'm assuming that the revised version of ##y_2(t)## is the correct function in this problem.

    BTW, it's easy to check two functions or two vectors for linear dependence/independence, but it's a bit more complicated when there are three or more vectors/functions. With three vectors or functions, it's possible that no two of them are multiples of each other, but the set of vectors/functions can still be linearly dependent.
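
    A standard illustration of that last point (my example, not from the textbook): take ##\vec{u} = <1, 0>##, ##\vec{v} = <0, 1>##, and ##\vec{w} = <1, 1>##. No two of them are multiples of each other, yet ##1 \cdot \vec{u} + 1 \cdot \vec{v} + (-1) \cdot \vec{w} = \vec 0## is a non-trivial combination, so the set ##\{\vec{u}, \vec{v}, \vec{w}\}## is linearly dependent.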
     
  11. Jun 24, 2017 #10

    Drakkith


    That's right.

    Thanks Mark. Any idea if I'll get into that in an introduction to differential equations class?
     
  12. Jun 24, 2017 #11

    Drakkith


    You're a little above my knowledge level, Fresh. I'm just an undergrad taking Introduction to Differential Equations (Math 254). I've never even seen most of what you just said in a math class before.
     
  13. Jun 24, 2017 #12

    fresh_42


    It's basically the same as with the good old arrows. If you have two, then linear independence means they point in two different directions, and if they are linearly dependent, they point in the same (or exactly opposite) direction, i.e. one is a multiple of the other. If we have three vectors, they can form a coordinate system in ordinary space, which means they are linearly independent, or they only define a plane or a line, in which case they are linearly dependent, as we cannot get the entire space from them.

    All I wanted to say is that one has to mention where the multiples are allowed to come from. ##\{1,i\}## point in two directions if we allow multiples from the reals, but in the same direction if we allow complex multiples, because ##i = i \cdot 1##, as in your definition (1). This happens because we cannot really draw complex numbers other than in two real dimensions. But in itself, ##\mathbb{C}## is a one-dimensional vector space, just a complex one.
     
  14. Jun 24, 2017 #13

    Drakkith


    That mostly makes sense. I understand that a vector is a multiple of another vector if you can multiply it by a scalar and have them equal each other, which requires that they point in the same direction or in opposite directions, right?

    I can't say I understand this, I'm afraid. It's been a while since I had to do anything with vectors, and I haven't worked with complex vectors yet.
     
  15. Jun 24, 2017 #14

    Mark44


    You should. When you get into finding solutions of 2nd order DEs, there will generally be an infinite number of solutions, but they are all linear combinations of two basic solutions that form a pair of linearly independent functions. These functions "span" the solution space. You probably don't understand these ideas just yet, but they show up in both differential equations and linear algebra. There's a large overlap between these two areas of mathematics.
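
    As a preview (a standard example, not from this problem set): the equation ##y'' + y = 0## has the linearly independent solutions ##y_1(t) = \cos(t)## and ##y_2(t) = \sin(t)##, and every solution can be written as ##y(t) = c_1\cos(t) + c_2\sin(t)## for some constants ##c_1, c_2##, so these two functions span the solution space.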

    Correct, if you're talking about linearly dependent vectors. The situation is the same with linearly dependent functions -- i.e., each one is some nonzero multiple of the other.
     
  16. Jun 24, 2017 #15

    fresh_42


    Yes. The opposite direction comes into play because we can multiply by negative numbers, which reverses the direction.
    One can also consider the reals as an infinite-dimensional vector space over the rationals. ##\pi## and ##e## or ##\sqrt{2}## are not multiples of each other if we only allow rationals as the base field, as the reservoir for multiples. In this sense, and according to definition (1), they are linearly independent over ##\mathbb{Q}##. But if we allow real multiples in definition (1), i.e. real stretches and compressions, then they become multiples of one another. So it is important where the allowed multiples come from.
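
    Concretely, for the pair ##\{\pi, \sqrt{2}\}## (my worked version of the remark): over ##\mathbb{R}## we have ##\sqrt{2} = \frac{\sqrt{2}}{\pi} \cdot \pi##, so they are multiples of each other. Over ##\mathbb{Q}##, if ##\sqrt{2} = q \cdot \pi## for some rational ##q \neq 0##, then ##\pi = \sqrt{2}/q## would be algebraic, contradicting the fact that ##\pi## is transcendental, so no rational multiple works and the pair is linearly independent over ##\mathbb{Q}##.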
     
  17. Jun 25, 2017 #16

    WWGD


    A good point to think about is whether ##c_1\sin(2t)+c_2\sin(t)\cos(t)=0## can hold. But notice that the zero on the right is the zero _function_, not just the number 0; remember we are adding combinations of functions, so we get a function, namely the function that is identically 0, i.e., ##f## with ##f(x)=0## for all ##x##.
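
    Working that out (a short sketch, using the same double-angle identity as above): with ##\sin(2t) = 2\sin(t)\cos(t)##, the combination becomes ##(2c_1 + c_2)\sin(t)\cos(t) = 0## for all ##t \in (0,1)##. Since ##\sin(t)\cos(t) \neq 0## on ##(0,1)##, this holds exactly when ##c_2 = -2c_1##, e.g. ##c_1 = 1, c_2 = -2##, a non-trivial solution, so the two functions are linearly dependent.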
     
  18. Jun 27, 2017 #17
    Excellent post. Are you aware of the definition of linear independence, Drakk? Or what it means? Maybe I can help clarify a few things.
    When I was first learning these ideas, I had a hard time understanding what linear independence, basis, etc. were.
    I read Serge Lang's Introduction to Linear Algebra and I finally understood these ideas. I recommend having this book on your bookshelf.
     
  19. Jun 27, 2017 #18

    Drakkith


    Only a very basic idea regarding two functions not being constant multiples of each other, and only in the context of two functions at a time. Most of the math notation and terminology above is just too far beyond my knowledge level right now.
     
  20. Jun 27, 2017 #19
    Definition of linear independence:

    Let ##V## be a vector space, and let ##v_1, \dots, v_n## (that is, ##n## vectors) be elements of ##V##. The vectors are said to be linearly independent if, whenever ##a_1, \dots, a_n## are numbers such that
    ##a_1 v_1 + \dots + a_n v_n = 0## (1)
    then ##a_i = 0## for all ##i = 1, \dots, n##. This is an easier definition to understand.

    The main idea is that if the vectors are linearly independent, the only way to make (1) true is if the numbers themselves are all equal to 0.

    If they are linearly independent, you cannot write any one of these vectors as a linear combination of the others.
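
    A quick worked check with this definition (my example): take ##v_1 = <1, 0>## and ##v_2 = <1, 1>##. If ##a_1<1, 0> + a_2<1, 1> = <0, 0>##, then comparing components gives ##a_1 + a_2 = 0## and ##a_2 = 0##, hence ##a_1 = a_2 = 0##. Only the trivial solution exists, so the two vectors are linearly independent.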

    Let me know if this makes sense. I can explain further.
     
  21. Jun 27, 2017 #20

    Mark44


    There is some subtlety here that escapes many students. For example, consider ##\vec{u} = <1, 2>## and ##\vec{v} = <2, 4>##.
    I notice that ##a_1<1, 2> + a_2<2, 4> = 0## when ##a_1 = 0## and ##a_2 = 0##, so I conclude (wrongly) that ##\vec u## and ##\vec v## are linearly independent. (I am repeating the reasoning that I've seen many students display.)

    Whether a set of vectors is linearly dependent or linearly independent, we can always write the equation ##a_1\vec{v_1} + a_2\vec{v_2} + \dots + a_n\vec{v_n} = \vec 0##. For a linearly independent set, there is only one solution to this equation: ##a_1 = a_2 = \dots = a_n = 0##; i.e., the trivial solution.
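
    To finish that example by solving the system explicitly: for ##\vec{u} = <1, 2>## and ##\vec{v} = <2, 4>##, the equation ##a_1<1, 2> + a_2<2, 4> = \vec 0## gives ##a_1 + 2a_2 = 0## and ##2a_1 + 4a_2 = 0##, i.e. the single condition ##a_1 = -2a_2##. Any choice such as ##a_1 = -2, a_2 = 1## is a non-trivial solution, so ##\vec u## and ##\vec v## are linearly dependent, even though the trivial solution also exists.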

    This would take less than one minute to learn. See https://www.physicsforums.com/help/latexhelp/, in the section titled Superscripts and subscripts.

    @Drakkith, we're talking about linear dependence/independence of vectors here. The situation is almost exactly the same for linear dependence/independence of functions.
     
  22. Jun 27, 2017 #21

    fresh_42


    There is another subtlety here in the case of two vectors. If one vector is zero, then the set is always linearly dependent, as there is always a non-trivial solution of the equation, with the coefficient in front of the zero vector being arbitrary. However, the word neither in definition (1) becomes important: let ##y_2 = 0##. Then ##\{y_1, 0\}## is linearly dependent, although ##0 \neq y_1 \neq c \cdot 0 = c \cdot y_2## whatever we choose ##c## to be, whereas ##0 = 1 \cdot y_2 = c \cdot y_1 = 0 \cdot y_1## satisfies the condition. With respect to this circumstance, the defining equation ##c_1 y_1 + c_2 y_2 = 0## is to be preferred over the special version in definition (1), because it covers this special case and avoids the need to mention the symmetry.
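
    A concrete version of that subtlety (my wording): if ##y_2## is the zero function, then ##0 \cdot y_1 + 1 \cdot y_2 = 0## is a non-trivial combination, since the coefficient ##1## is non-zero, so ##\{y_1, y_2\}## is linearly dependent by the equation form of the definition, even though ##y_1## is not a constant multiple of ##y_2## when ##y_1 \neq 0##.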
     
  23. Jun 27, 2017 #22

    Drakkith


    So given my two functions above, we have ##(1)##: ##A_1\cos(t)\sin(t)+A_2\sin(2t)=0##. When I graph ##A_1\cos(t)\sin(t)## and ##A_2\sin(2t)##, they become identical when ##A_1## and ##A_2## are both 0, when ##A_1=2## and ##A_2=1##, or when ##A_1=1## and ##A_2=1/2##. Actually, it appears they become identical whenever ##\frac{A_1}{A_2}=2##. Linearly independent functions would only be solutions to ##(1)## when ##A_1=0## and ##A_2 = 0##, correct?
     
  24. Jun 27, 2017 #23

    WWGD


    Correct, but don't you mean ##A_2 \sin(2t)##?
     
  25. Jun 27, 2017 #24

    Drakkith


    Yes I do! I even messed that up on my graph and then forgot to correct my post!
    Edit: Post corrected.
     
  26. Jun 27, 2017 #25
    Thanks Mark! I forgot to mention that if the only solution is the trivial solution, then the vectors are linearly independent. If there exist other solutions besides the trivial solution, the vectors are linearly dependent.

    @drakk, if you want to understand what those symbols mean, you can read a short book on set theory. A finer way, and I believe more useful, would be to read Hubbard and Hubbard: Vector Calculus, Linear Algebra, and Differential Forms.

    @Dark Light
    Later, or in your DE class, you will learn another method for checking whether 2 or more functions are linearly independent. It's called the Wronskian, and it is computed as a determinant. Of course, there are a limitation or two to this method: it requires the functions in the set to be differentiable ##n-1## times, where ##n## is the number of elements in the set, and when the determinant is equal to 0, we cannot determine whether the functions are dependent or independent.
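
    As a preview of that method (a small worked sketch with the functions from this thread): the Wronskian of ##y_1, y_2## is ##W(t) = y_1(t)y_2'(t) - y_1'(t)y_2(t)##. With ##y_1(t)=\cos(t)\sin(t)=\tfrac{1}{2}\sin(2t)## and ##y_2(t)=\sin(2t)## we get ##y_1'(t)=\cos(2t)## and ##y_2'(t)=2\cos(2t)##, so ##W(t) = \tfrac{1}{2}\sin(2t) \cdot 2\cos(2t) - \cos(2t)\sin(2t) = 0## for all ##t##, which is consistent with the two functions being linearly dependent (though, as noted, a vanishing Wronskian by itself does not prove dependence).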
     