Analysis - solutions to differential equations

  • Thread starter Kate2010
  • Start date
  • #1

Homework Statement



Suppose that f : R -> R is twice differentiable and satisfies f''(x) + f(x) = 0 for all x, with f(0) = 0 and f'(0) = 0.
Prove that f'(x) = f(x) = 0 for all x.

Homework Equations

The Attempt at a Solution

I can solve this using methods from calculus, via the auxiliary equation and the boundary conditions. However, I am unsure how to go about it as a piece of pure maths. In examples in my notes I have needed to use the 'Identity Theorem', a corollary of the Mean Value Theorem, which states that if f : (a,b) -> R is differentiable and satisfies f'(t) = 0 for all t in (a,b), then f is constant on (a,b). However, I am unsure whether this is the correct method in this case, and if it is, how to use it.

Thanks :)
 

Answers and Replies

  • #2
Consider the derivative of the function [itex]g = (f')^2 + f^2,[/itex] apply the "Identity Theorem", and use the initial conditions.
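Spelling out that hint (a sketch filling in the steps the reply leaves to the reader):

```latex
\begin{align*}
g(x)  &= f'(x)^2 + f(x)^2, \\
g'(x) &= 2f'(x)f''(x) + 2f(x)f'(x)
       = 2f'(x)\bigl(f''(x) + f(x)\bigr) = 0.
\end{align*}
% By the Identity Theorem, g is constant on R, so for every x
\[
g(x) = g(0) = f'(0)^2 + f(0)^2 = 0.
\]
% A sum of two squares vanishes only when both terms do, hence
% f'(x) = f(x) = 0 for all x.
```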
 
  • #3
Thanks! That makes a lot of sense.

The next part of the question is a more general version: if h is twice differentiable and satisfies h''(x) + h(x) = 0, prove that h(x) = A cos x + B sin x.
Using your advice, I can show h'(x)^2 + h(x)^2 = constant.
I see that this looks a bit like Pythagoras, but am not sure how I would prove that it involves sin and cos.
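One way to finish the general case (a sketch, reusing the first part rather than the Pythagoras observation): subtract off the candidate solution and show the difference is identically zero.

```latex
% Let A = h(0) and B = h'(0), and define
\[
u(x) = h(x) - A\cos x - B\sin x.
\]
% Since \cos and \sin both satisfy y'' + y = 0, so does their
% linear combination, and therefore
\[
u''(x) + u(x) = h''(x) + h(x) = 0,
\]
% with u(0) = h(0) - A = 0 and u'(0) = h'(0) - B = 0.
% By the first part of the question, u(x) = 0 for all x, i.e.
% h(x) = A\cos x + B\sin x.
```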
 
