
Analysis Question--differentiability, continuity

  1. Jul 16, 2012 #1
    Analysis Question--differentiability, continuity

    1. The problem statement, all variables and given/known data
    Suppose [itex]f:\mathbb{R}\to\mathbb{R}[/itex] is a [itex]C^\infty[/itex] function which satisfies the equation [itex]f''(x)=-x^2f(x)[/itex] along with [itex]f(0)=1[/itex], [itex]f'(0)=0[/itex]. Prove that there is an [itex]a>0[/itex] such that [itex]f(a)=0[/itex]. Do not use any results from differential equations. Thank you.


    2. Relevant equations



    3. The attempt at a solution
    Since [itex]f(0)=1>0[/itex] and f is continuous, f is positive on some interval [itex](0,\delta)[/itex], so [itex]f''(x)=-x^2f(x)<0[/itex] there and f is concave down on that interval. If we can show f is also decreasing, then it follows that f must cross the x-axis before it can change concavity and start increasing, since f is differentiable everywhere and so can have no cusps. I have no idea if that is the right track. Thank you.
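
    One way to get the decreasing part (just a sketch, and it assumes f stays positive on the interval in question): since [itex]f'(0)=0[/itex] and [itex]f''[/itex] is continuous, the fundamental theorem of calculus gives

    [tex]f'(x)=f'(0)+\int_0^x f''(t)\,dt=-\int_0^x t^2 f(t)\,dt<0\quad\text{for }x>0\text{ whenever }f>0\text{ on }[0,x].[/tex]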
    Last edited by a moderator: Jul 16, 2012
  2. Jul 18, 2012 #3
    Re: Analysis Question--differentiability, continuity

    Not true. Counterexample: sin(x)+2 alternates between convex and concave without ever crossing the x-axis.

    Perhaps it'd be easier to prove by contradiction.
    f(0)=1>0, so by continuity f(x)>0 in an immediate neighborhood x ∈ [0,c]. Let's (erroneously) assume f(x)>0 for all x>0. Since then f''(x)<0, i.e. f is concave down, it follows that
    f(x) < f(c) + f'(c)(x-c) for all x, c > 0 with x ≠ c.
    You can prove f'(x)<0 for all x>0 (if the assumption is true), hence f'(x) = -|f'(x)|.

    Now let x = ζ = c + f(c)/|f'(c)|; this gives
    f(ζ) < 0 with ζ > c > 0, which contradicts the assumption that f(x)>0 for all x>0.
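
    Spelling out that last substitution (a sketch, using the tangent-line bound above with c>0 fixed so that f'(c)=-|f'(c)|<0):

    [tex]f(\zeta)<f(c)+f'(c)(\zeta-c)=f(c)-|f'(c)|\cdot\frac{f(c)}{|f'(c)|}=0,[/tex]

    where the inequality is strict because [itex]\zeta\neq c[/itex] and [itex]f''<0[/itex] on [itex](0,\infty)[/itex] under the assumption.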

    Not pretty, just bouncing some ideas around.
     
    Last edited: Jul 18, 2012