I can't seem to wrap my head around this concept, and I'm hoping you can help me out. Suppose you have a continuous function defined on some compact subset of the plane, say the unit square {0 <= x <= 1, 0 <= y <= 1}. I guess the function could be either real- or complex-valued, but let's just say it's real so we don't have to worry about any funky complex business going on. Also, suppose the function vanishes on the bottom side of the square, {y = 0, 0 <= x <= 1}.
What is wrong with the following proof that the function must therefore vanish on the entire boundary?
Suppose it does not vanish on {x = 0, 0 < y <= 1}. Then there exists an e > 0 such that |f(x, y)| > e for (x, y) in {x = 0, 0 < y <= 1}. Now consider e/2. Continuity implies that, as (x, y) approaches the origin along the y-axis, there must exist a d > 0 such that |(x, y) - (0, 0)| < d implies |f(x, y) - f(0, 0)| = |f(x, y)| < e/2. But we know that f is bigger than e on this portion of the y-axis, so this can't be possible. Therefore, f is either not continuous, or f also vanishes on the y-axis between 0 and 1. We can then use the same logic to say f vanishes on the entire boundary of the square.
This doesn't feel correct at all, but I can't figure out what's wrong with my proof. Is it because I assumed that, since f does not vanish on that edge, it must be bigger than some single e > 0 on the whole edge?
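For what it's worth, here is a quick sanity check I tried with the simple candidate f(x, y) = y, which is continuous on the square and vanishes on the bottom edge, to see whether the conclusion could even hold in general (the candidate function is just my own pick for illustration):

```python
# Sanity check with f(x, y) = y: it is continuous on the unit square
# and vanishes on the bottom edge {y = 0, 0 <= x <= 1}.

def f(x, y):
    return y

# f vanishes everywhere on the bottom edge...
assert all(f(i / 10, 0.0) == 0.0 for i in range(11))

# ...but f does NOT vanish on the left edge {x = 0, 0 < y <= 1},
# so this function would be a problem for the claimed conclusion.
assert all(f(0.0, j / 10) > 0.0 for j in range(1, 11))

print("f vanishes on the bottom edge but not on the left edge")
```

So it seems the conclusion of my proof can't be right, which is why I suspect the error is in the step I flagged above.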