wayneckm
Hello all,
For a monotonically increasing (resp. decreasing) function f(x) on x \in \mathbb{R}, we can only have a supremum (resp. infimum), which "occurs at x = \infty" with value \lim_{x \uparrow \infty} f(x). Otherwise, if it were a maximum/minimum attained at some finite x, it would violate the assumption of strict monotonicity.
Am I correct in the above statement?
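To make the claim concrete, here is a simple example of the situation I have in mind (the choice of \arctan is just my own illustration):

```latex
% f(x) = \arctan(x) is strictly increasing on all of \mathbb{R}.
\[
  \sup_{x \in \mathbb{R}} \arctan(x)
    = \lim_{x \uparrow \infty} \arctan(x)
    = \frac{\pi}{2},
\]
% yet no finite x_0 satisfies \arctan(x_0) = \pi/2,
% so the supremum is never attained as a maximum.
```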