How Do You Find the Slant Asymptote of \( y = \frac{x}{2} - \tan^{-1}x \)?

Summary:
To find the slant asymptotes of the function \( y = \frac{x}{2} - \tan^{-1}x \), the derivative is analyzed as \( x \to \pm\infty \), yielding \( \lim_{x\to\pm\infty} \frac{dy}{dx} = \frac{1}{2} \). This indicates the oblique asymptotes take the form \( y = \frac{1}{2}x + b \). To determine \( b \), the difference between the function and the asymptote must tend to zero, which gives \( b = -\frac{\pi}{2} \) as \( x \to +\infty \) and \( b = \frac{\pi}{2} \) as \( x \to -\infty \). Consequently, the slant asymptotes are \( y = \frac{x - \pi}{2} \) and \( y = \frac{x + \pi}{2} \), respectively.
ISITIEIW
Hey!
I know how to find slant asymptotes of regular rational functions, but what happens when the function is $y= \frac{x}{2} - \tan^{-1}x$ ?
Is there a special way to do this? I know what the $\arctan x$ function looks like: its range is $y\in\left(-\frac{\pi}{2},\,\frac{\pi}{2}\right)$ and its domain is $x\in\mathbb{R}$. The answer is supposed to be $y=\frac{x-\pi}{2}$.

Thanks!
 
This is how I would work the problem:

First, let's analyze the derivative of the function as $x\to\pm\infty$:

$$\lim_{x\to\pm\infty}\frac{dy}{dx}= \lim_{x\to\pm\infty}\left(\frac{1}{2}-\frac{1}{x^2+1} \right)= \frac{1}{2}$$
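If you want a quick sanity check on that slope, here is a minimal SymPy sketch (assuming SymPy is available; the variable names are just illustrative):

```python
# Minimal sketch (assumes SymPy): confirm the limiting slope of y = x/2 - arctan(x).
import sympy as sp

x = sp.symbols('x')
y = x / 2 - sp.atan(x)        # the function in question
dydx = sp.diff(y, x)          # 1/2 - 1/(x**2 + 1)

print(sp.limit(dydx, x, sp.oo))    # 1/2
print(sp.limit(dydx, x, -sp.oo))   # 1/2
```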

Hence, the oblique asymptote will have the form:

$$y=\frac{1}{2}x+b$$

Now, we require the difference between this asymptote and the function to diminish to zero as $x\to\pm\infty$, so we may write:

$$\lim_{x\to\pm\infty}\left(\frac{1}{2}x-\tan^{-1}(x)-\frac{1}{2}x-b \right)=0$$

$$\lim_{x\to\pm\infty}\left(\tan^{-1}(x)+b \right)=0$$

$$b=\mp\frac{\pi}{2}$$

since $\tan^{-1}(x)\to\pm\frac{\pi}{2}$ as $x\to\pm\infty$ (the upper signs correspond to $x\to+\infty$ and the lower signs to $x\to-\infty$).

Thus, the asymptotes must be:

$$y=\frac{x\mp\pi}{2}$$

that is, $y=\frac{x-\pi}{2}$ as $x\to+\infty$ and $y=\frac{x+\pi}{2}$ as $x\to-\infty$.
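As a final check, a short SymPy sketch (again assuming SymPy is available) confirms that the difference between the function and each proposed asymptote vanishes in the corresponding direction:

```python
# Sketch (assumes SymPy): the function minus each asymptote tends to 0.
import sympy as sp

x = sp.symbols('x')
y = x / 2 - sp.atan(x)

print(sp.limit(y - (x - sp.pi) / 2, x, sp.oo))   # 0, so y = (x - pi)/2 as x -> +oo
print(sp.limit(y - (x + sp.pi) / 2, x, -sp.oo))  # 0, so y = (x + pi)/2 as x -> -oo
```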
 
