# Does this condition imply f:R^2->R is continuous?

1. Jul 27, 2012

### Tinyboss

Here's an interesting question--I've asked some faculty members around here and "off the top of their head" none of them knows the answer. My gut says "yes", but my gut sucks at math. So here's the statement:

Suppose we have a function $f:\mathbb{R}^2\to\mathbb{R}$, with the property that for every line segment $L\subset\mathbb{R}^2$, the restriction $f\big|_L$ is continuous. Is $f$ necessarily continuous?

2. Jul 27, 2012

### lugita15

My gut says no, for the following reason. In order for the limit of f(x,y) as (x,y) goes to (a,b) to exist, the limit of f(x,y) as (x,y) approaches (a,b) along any continuous curve C in the plane must exist and must have the same value for every such curve C. However, there are functions for which the limit exists along every line passing through (a,b), but the limit itself still doesn't exist. For instance, the limit of $\frac{x^2y}{x^4+y^2}$ as (x,y) goes to (0,0) along any line through the origin is 0, but the limit along the curve y=x^2 is 1/2. So I would reason that continuity of the function along every line doesn't even suffice to guarantee existence of the limit at a single point, so it doesn't suffice to guarantee continuity.

3. Jul 27, 2012

### Robert1986

So, I think this might work (but I'm leaving soon, so I don't have time to really think; I just thought I'd post an initial reaction, and if other people see improvements or downright errors, they can let me know). I think I also just found a better idea, if this one makes sense, but I'm about to leave and don't have time to write it up.

Anyway, my answer is "yes." Basically, I claim that if $x_n$ is a sequence approaching the origin, then eventually, since $f\big|_L$ is continuous for each line $L$ through the origin, all of the $f(x_n)$ will be close to $f(0)$; that is, $f(x_n) \to f(0)$. This only explicitly proves continuity at $0$, but (if it is correct) it extends easily to all $x \in \mathbb{R}^2$.

Let $\{x_n\}$ be a sequence in $\mathbb{R}^2$ that converges to $0$, where $0$ is the zero vector. Let $\Lambda$ be the collection of lines through $0$, and write its elements as $L_\theta$, with $\theta$ the angle of rotation from the positive $x$-axis.

Now, for each real number $r$, consider the set $\{f(x_{r,L_\theta}):x_{r,L_\theta} \in L_\theta, |x| = r\}$.

Let $k_M=\sup\{|f(x_{r,L_\theta}) - f(0)|:|x_{r,L_\theta}|=r, x_{r,L_\theta} \in L_\theta \}$.

Now, let $\epsilon > 0$. I want to show that there is an $N$ such that $|f(x_n) - f(0)| < \epsilon$ for all $n \geq N$.

Now, since $f\big|_{L_\theta}$ is continuous, there exists an $r$ such that $k_M < \epsilon$ (since $|f\big|_{L_\theta} (x_{r,\theta}) - f(0)| < \epsilon$ for $r$ close enough to $0$). Since $x_n \to 0$, there is an $N$ such that $|x_n| < r$ for each $n \geq N$.

Thus, for all $n \geq N$, $|f(x_n) - f(0)| < \epsilon$ and so $f$ is continuous at $0$.

EDIT:
I think $k_M$ might be a problem; can these be defined?

Last edited: Jul 27, 2012
4. Jul 27, 2012

### micromass

I personally think the proposition is false. I have an idea for a counterexample, but I'm trying to rigorize it.

Here's my understanding of Robert's proof:

You mean that for each fixed $r$, we can define $k_M$ as that supremum. The supremum is taken over all $\theta$, right?
So the supremum depends on $r$, right? It might be better to write $k_M(r)$.
I also note that we still need to show that $k_M(r)$ is finite.

This last point is problematic, I think. You have that $f\vert_{L_\theta}$ is continuous at $0$. So for a given $\theta$, there exists an $r$ such that we can make

$$|f(x_{r,\theta})-f(0)|<\varepsilon$$

But the thing is that our r here depends on $\theta$. So we can't just take the supremum of this expression to get

$$\sup_{\theta} |f(x_{r,\theta})-f(0)|\leq \varepsilon$$

since the r's change with $\theta$.

5. Jul 27, 2012

### Robert1986

Here is another try that might be simpler (my wife is getting ready, so I have near-infinite time, contrary to what I previously thought :) )

Let $\Lambda$ and $L_\theta$ be defined as in my last post.
Let $k_{r,\theta} = |f(x_{r,\theta}) - f(0)|$ and let $k_{M,r} = \sup_\theta k_{r,\theta}$.

Now, let $\epsilon > 0$. Then there is an $r > 0$ such that $k_{M,r}<\epsilon$. Thus, if $|x| < r$ then $|f(x) - f(0)| \leq k_{M,r} < \epsilon$.

That's shorter, and uses the same ideas, so the mistakes (if there are any) will still be there.

6. Jul 27, 2012

### Robert1986

I think I fixed some of the notation issues in my last post. (I really have to leave this time, so this will be short.) But what if we add that $f$ is bounded? This at least takes away the problem of $k_{M,r}$ being infinite, right?

7. Jul 27, 2012

### Robert1986

Yeah - I see the problem with the $k_{M,r}$.

That messes it all up I think.

8. Jul 27, 2012

### micromass

I don't see this step. For every $\theta$, there exists an $r$ such that $k_{r,\theta}<\varepsilon$. But you can't just take the supremum, since $r$ depends on $\theta$.

9. Jul 27, 2012

### micromass

Here's what I'm thinking about for a counterexample.

In $\mathbb{R}^2$, take the curve $C=\{(x,y)~\vert~x>0, y=x^2\}$. This is of course just the positive half of a parabola.

Now, take the set

$$A=\{(x,y)~\vert~x\geq 0,~\frac{x^2}{2}\leq y\leq 2x^2\}$$

This is a set around our curve C.

Now, take a function f such that f(x,y)=1 for (x,y) in C and f(x,y)=0 for (x,y) not in A.

Within A, the function interpolates linearly between 0 and 1 (I don't want to write this out formally, but I think it's clear).

This function is not continuous at 0: the sequence $(1/n,1/n^2)$ converges to 0, but $f(1/n,1/n^2)=1$ while $f(0,0)=0$.
I think (but am not sure) that it is continuous on every line.
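One concrete version of this construction can be checked numerically. The interpolation below is my own choice (the post leaves it informal), not necessarily what micromass had in mind:

```python
def f(x, y):
    # A "tent" around the half-parabola y = x^2 (x > 0): f = 1 on the curve,
    # f = 0 outside the region A = {x >= 0, x^2/2 <= y <= 2x^2}, linear between.
    if x <= 0 or y <= x**2 / 2 or y >= 2 * x**2:
        return 0.0
    if y <= x**2:
        return (y - x**2 / 2) / (x**2 / 2)   # 0 at y = x^2/2, 1 at y = x^2
    return (2 * x**2 - y) / x**2             # 1 at y = x^2, 0 at y = 2x^2

# Along the parabola the values stay at 1, but f(0,0) = 0:
print([round(f(1.0 / n, 1.0 / n**2), 6) for n in (10, 100, 1000)])  # [1.0, 1.0, 1.0]
# Along the line y = x, points near the origin lie outside A, so the values are 0:
print([f(1.0 / n, 1.0 / n) for n in (10, 100, 1000)])               # [0.0, 0.0, 0.0]
```

Near the origin, every line through 0 eventually leaves the region A (the parabola-shaped wedge is tangent to the x-axis), which is why the restrictions to such lines see only the value 0 close to the origin.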

10. Jul 27, 2012

### haruspex

In polar form, f = sin 2θ. Consider the origin.

Last edited: Jul 27, 2012
11. Jul 27, 2012

### Robert1986

OK. I'm going to think about micromass's counter example. But here is another attempt:

Let $\epsilon > 0$. For each $L_\theta$ there is a $\delta_\theta$ such that if $x_\theta \in L_\theta$ and $|x_\theta| < \delta_\theta$ then $|f(x_\theta) - f(0)|< \epsilon$. Now, let $\delta = \inf_\theta \delta_\theta$.

Now, let $x$ be arbitrary. Then $x$ lies on some line through the origin, say $L_\psi$. If $|x| < \delta_\psi$ then $|f(x) - f(0)| < \epsilon$. But $\delta \leq \delta_\psi$, so if $|x| < \delta$ then $|f(x)-f(0)| < \epsilon$.

12. Jul 27, 2012

### micromass

Why is $\delta>0$?
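This objection can be made concrete with lugita15's earlier example $x^2y/(x^4+y^2)$: on the line $y = mx$, the point $(m, m^2)$ is only distance about $m$ from the origin, yet $f = 1/2$ there, so for $\epsilon < 1/2$ the $\delta_\theta$ for that line must shrink with $m$, and the infimum over all lines is $0$. A quick numeric sketch:

```python
import math

def f(x, y):
    # lugita15's example, extended by f(0,0) = 0
    if (x, y) == (0.0, 0.0):
        return 0.0
    return (x**2 * y) / (x**4 + y**2)

# (m, m^2) lies on the line y = m*x at distance ~m from the origin,
# yet f takes the value 1/2 there:
for m in (0.1, 0.01, 0.001):
    print(m, math.hypot(m, m**2), f(m, m * m))  # distance shrinks, value stays ~0.5
```

So even though each restriction $f\big|_{L_\theta}$ is continuous at $0$, no single $\delta > 0$ works for all lines at once.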

13. Jul 27, 2012

### Robert1986

Yeah; that's a problem. That might do me in. I'm going to think about this tonight; I'd like to say that if $\delta = 0$ then $f$ isn't continuous on a line, but I don't think that is true.

14. Jul 27, 2012

### Bacle2

Isn't

$$f(x,y)= \begin{cases} \dfrac{xy}{x^2+y^2}, & (x,y)\neq (0,0) \\ 0, & (x,y)=(0,0) \end{cases}$$

the standard counterexample? f(c,y) and f(x,c) are continuous for c constant/fixed (with a similar argument for "slanted" lines), but f(x,y) is not continuous at (0,0): if we approach along y=x, the limit is 1/2 ≠ 0.
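The behavior of this function's restrictions is easy to tabulate numerically (a quick sketch, using the definition above):

```python
def f(x, y):
    # x*y / (x^2 + y^2), extended by f(0,0) = 0
    if (x, y) == (0.0, 0.0):
        return 0.0
    return x * y / (x**2 + y**2)

# Along either coordinate axis the restriction is identically 0:
print([f(t, 0.0) for t in (1.0, 0.1, 0.01)])   # [0.0, 0.0, 0.0]
# Along y = x the value is 1/2 at every t != 0, while f(0,0) = 0:
print([f(t, t) for t in (1.0, 0.1, 0.01)])     # [0.5, 0.5, 0.5]
```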

15. Jul 27, 2012

### micromass

But it should be continuous along every line. So it should also be continuous on the line y=x.

16. Jul 27, 2012

### lugita15

What about the example I gave above, $x^2y/(x^4+y^2)$? Its limit along any line passing through the origin is 0, but its limit along y=x^2 is 1/2.
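The two limits in this example are easy to verify numerically (a sketch; the step size $2^{-20}$ is just a convenient small number):

```python
def f(x, y):
    # x^2*y / (x^4 + y^2), extended by f(0,0) = 0
    if (x, y) == (0.0, 0.0):
        return 0.0
    return (x**2 * y) / (x**4 + y**2)

x = 2.0**-20  # a small step toward the origin
# Along lines y = m*x the values shrink toward 0:
print([f(x, m * x) for m in (0.5, 1.0, 3.0)])   # all of order 1e-6 or smaller
# Along the parabola y = x^2 the value stays at 1/2:
print(f(x, x**2))                                # ~0.5
```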

17. Jul 27, 2012

### micromass

Oh, I missed that somehow. Yeah, that's a very pretty example actually!! Nice!

18. Jul 27, 2012

### lugita15

If you like that, you can generalize it, by increasing the exponents, so that all quadratics passing through the point yield the same limit. And for any n, you can make it so that all polynomials of degree less than or equal to n yield the same limit. What would be really neat is if you could make it so that ALL polynomials, of all orders, yield the same limit and still have the limit not exist. I don't know how to do that, or even whether it is possible, but that would be really counterintuitive.
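One way to realize the first step of this generalization (my choice of exponents, offered as an illustration rather than lugita15's specific construction) is $x^4y/(x^8+y^2)$: along every line and every quadratic through the origin the limit is 0, but along $y = x^4$ the value is identically 1/2.

```python
def g(x, y):
    # x^4*y / (x^8 + y^2), extended by g(0,0) = 0.
    # Along y = a*x + b*x^2 the ratio behaves like x^3/a (or x^2/b), hence -> 0;
    # along y = x^4 it equals x^8 / (2*x^8) = 1/2 identically.
    if (x, y) == (0.0, 0.0):
        return 0.0
    return (x**4 * y) / (x**8 + y**2)

x = 0.01
print(g(x, x))      # line y = x: of order 1e-6
print(g(x, x**2))   # parabola y = x^2: of order 1e-4
print(g(x, x**4))   # y = x^4: ~0.5
```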

19. Jul 28, 2012

### Bacle2

You're right. This one is continuous as a function of y and x individually, but not as a function f(x,y).

20. Jul 28, 2012

### Bacle2

I think the overall issue comes down to the fact that you can have f: R^2 --> R not continuous and remove the discontinuities by pre-composing with the right g: R^2 --> R^2; in this case, you want g(x,y) = (x, mx+b) to smooth out the discontinuities and, of course, not introduce new ones.