Is f'(x) defined at x=0 for the given piecewise function?

  • Context: Undergrad 
  • Thread starter: Robokapp
SUMMARY

The piecewise function f(x) = sin(x) for x ≤ 0 and f(x) = x for x > 0 is continuous at x = 0, since the limits from both sides equal f(0). The derivative f'(0) exists, because the left-hand and right-hand derivatives both equal 1. Verifying that the left and right derivatives agree is a valid way to establish differentiability at that point. The discussion emphasizes analyzing piecewise functions by separately evaluating their one-sided derivatives at the boundary points.

PREREQUISITES
  • Understanding of piecewise functions
  • Knowledge of limits and continuity
  • Familiarity with derivatives and their definitions
  • Basic calculus concepts such as left-hand and right-hand derivatives
NEXT STEPS
  • Study the concept of limits in depth, particularly for piecewise functions
  • Learn how to compute left-hand and right-hand derivatives
  • Explore the implications of differentiability in calculus
  • Review examples of continuity and differentiability in piecewise functions
USEFUL FOR

Students studying calculus, particularly those grappling with piecewise functions and their differentiability, as well as educators seeking to clarify these concepts for their students.

Robokapp
It's a question I had on a quiz a few minutes ago.

f(x) = sin(x) for x ≤ 0, and
f(x) = x for x > 0

Question was...does f'(x) exist? what is it?

-------------

First I proved that f(x) is continuous at 0 by stating that the limit as x→0 from the left = the limit as x→0 from the right = f(0).

Then...taking the derivative turned out to be a problem.
The lim[f(x+h)-f(x)]/h was unclear because, if x = 0 and you add some positive h to it, you'd be evaluating f where f(x) = x, yet the point (0, 0) itself belongs to sin(x), not to x.

What I did was state that at x-values close to zero, from the left and from the right, f'(x) = 1. Basically I ruled out situations like |x|, where dy/dx jumps from -1 to 1 around x = 0.

I assumed that if f'(-0.0001) and f'(0.0001) are equal, the function (which we know is continuous) must be differentiable at that point.
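That numerical check can be sketched in a few lines of Python. This is only an illustration of the reasoning above, not a proof; the step size h and the helper name approx_deriv are arbitrary choices:

```python
import math

def f(x):
    """The quiz's piecewise function: sin(x) for x <= 0, x for x > 0."""
    return math.sin(x) if x <= 0 else x

def approx_deriv(x, h=1e-8):
    """Forward-difference approximation of f'(x); h is an arbitrary small step."""
    return (f(x + h) - f(x)) / h

left = approx_deriv(-0.0001)   # stays on the sin(x) branch: ~cos(-0.0001)
right = approx_deriv(0.0001)   # stays on the x branch: slope is 1
print(left, right)             # both come out very close to 1
```

Both values land very close to 1, matching the claim that the slopes agree near x = 0.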

---------------

But my question is... is my logic something that will make my teacher tear his hair out before giving me a nice zero, or a 'wise' way to look at the issue?

Also, is there a better, clearer way to solve this?

I always feel uncertain about what to do, and with which expression, when a function is split according to its domain.

Thank you.

~Robokapp
 
Split your argument in two:
1. Find the right-hand derivative
2. Find the left-hand derivative

If they both exist and are equal, then your function is differentiable at 0.
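As a sketch of those two steps, the one-sided difference quotients at 0 can be tabulated straight from the definition. The helper name one_sided_quotient and the shrinking h values are my own choices for illustration:

```python
import math

def f(x):
    # piecewise function from the thread: sin(x) for x <= 0, x for x > 0
    return math.sin(x) if x <= 0 else x

def one_sided_quotient(h):
    # difference quotient [f(0 + h) - f(0)] / h; the sign of h picks the side
    return (f(h) - f(0)) / h

for h in (0.1, 0.01, 0.001):
    left = one_sided_quotient(-h)    # uses the sin branch -> sin(-h)/(-h)
    right = one_sided_quotient(+h)   # uses the x branch   -> h/h = 1
    print(f"h={h}: left={left:.6f}, right={right:.6f}")
```

Both columns tend to 1 as h shrinks, which is exactly the agreement of the left-hand and right-hand derivatives at 0.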
 
f is trivially differentiable for x > 0 and x < 0. At x = 0, one merely needs to verify that the left and right derivatives exist and agree. They trivially exist, and since cos(x) tends to 1 as x tends to 0, they agree: both one-sided derivatives equal 1, so f'(0) = 1.
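Spelled out from the limit definition, using the standard limit sin(h)/h → 1, the two one-sided derivatives at 0 are:

f'₋(0) = lim_{h→0⁻} [sin(h) − sin(0)]/h = lim_{h→0⁻} sin(h)/h = 1
f'₊(0) = lim_{h→0⁺} [h − 0]/h = 1

Since both exist and agree, f'(0) = 1.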
 
