# Prove by Beta and Sigma limit definition

1. Aug 5, 2005

### kidia

Use the $$\beta$$–$$\sigma$$ definition of the limit to prove that

$$\lim_{(x,y)\to(0,0)}\frac{x+y}{x^2+y^2}=0$$

2. Aug 6, 2005

### EnumaElish

3. Aug 6, 2005

### rachmaninoff

I think beta/sigma definitions are a 2D analogue of epsilon/delta? I'm afraid I can't find any reference to them; you'll have to define them for us, please.

4. Aug 6, 2005

### quasar987

You want to show that for a given $\beta >0$, you can find a number $\sigma >0$ such that whenever the distance from the point (x,y) to the origin is smaller than $\sigma$, it follows that

$$\frac{x+y}{x^2+y^2}<\beta$$

So we kinda want to find a function $\sigma(\beta)$.

The statement "the distance from the point (x,y) to the origin is smaller than $\sigma$" is written mathematically as $\sqrt{x^2+y^2}<\sigma$.
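Putting these pieces together, the full formal statement to be proved can be written in one line (this is just a restatement of the definition above, with the absolute value made explicit):

```latex
\forall \beta > 0 \;\; \exists \sigma > 0 :
\quad 0 < \sqrt{x^2 + y^2} < \sigma
\;\Longrightarrow\;
\left| \frac{x+y}{x^2+y^2} - 0 \right| < \beta
```

So a proof amounts to exhibiting, for each $\beta$, a concrete value of $\sigma$ that makes the implication hold.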

There are many solutions but here's a hint based on one:

Use the fact that $x+y \leq (x^2+y^2)^2$ coupled with the hypothesis $\sqrt{x^2+y^2}<\sigma$ to define a function $\sigma(\beta)$.
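Taking the hinted inequality $x+y \leq (x^2+y^2)^2$ as given (as stated, it is the key assumption of this particular approach), the sketch of the argument would run as follows:

```latex
\frac{x+y}{x^2+y^2}
\;\leq\; \frac{(x^2+y^2)^2}{x^2+y^2}
\;=\; x^2+y^2
\;<\; \sigma^2 ,
\qquad \text{since } \sqrt{x^2+y^2} < \sigma .
```

So demanding $\sigma^2 \leq \beta$, i.e. choosing $\sigma(\beta) = \sqrt{\beta}$, would force the quotient below $\beta$, which is exactly the shape of function $\sigma(\beta)$ the hint asks for.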