1. The problem statement, all variables and given/known data

You have a function f: R -> R with f(x) = f(x+k) for all x, where k is in R and k > 0. Prove or disprove:
1) If f is continuous at x0, then it is also continuous at x0 + k.
2) If the limit of f at infinity is 0, then f(x) = 0 for all x in R.

2. Relevant equations

3. The attempt at a solution

1) True: Suppose that for every epsilon > 0 there is a delta > 0 such that for every x with |x - x0| < delta we have |f(x) - f(x0)| < epsilon. Then for every x with |x - (x0+k)| < delta we have |(x-k) - x0| < delta, hence |f(x-k) - f(x0)| < epsilon. By periodicity f(x-k) = f(x) and f(x0) = f(x0+k), so |f(x) - f(x0+k)| < epsilon, i.e. f is continuous at x0 + k.

2) True: Suppose there is an x0 in R with f(x0) > 0. Choose epsilon = f(x0)/2, so |f(x0) - 0| > epsilon. By periodicity f(x0 + nk) = f(x0) for every natural number n, so for every N > 0 we can find an n such that x0 + nk > N and |f(x0 + nk) - 0| > epsilon. This shows the limit of f at infinity is not 0, contradicting the given information. (If instead f(x0) < 0 for some x0, the proof is analogous with epsilon = -f(x0)/2.) Hence f(x) = 0 for all x in R.

Are those right? Thanks.
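As a numerical sanity check of the argument in 2) (not part of the proof), here is a sketch with an assumed concrete periodic function f(x) = sin(2*pi*x), which has k = 1 and f(x0) > 0 at x0 = 0.25: along x0 + n*k the values never drop below epsilon, so f cannot tend to 0 at infinity.

```python
import math

# Assumed example function: f(x) = sin(2*pi*x), periodic with k = 1.
def f(x):
    return math.sin(2 * math.pi * x)

k = 1.0
x0 = 0.25           # f(x0) = sin(pi/2) = 1 > 0
eps = f(x0) / 2     # the epsilon chosen in the attempt above

# However large N is, x0 + n*k eventually exceeds N, yet f stays put:
for n in range(1, 1000):
    assert abs(f(x0 + n * k) - f(x0)) < 1e-9  # periodicity: f(x0+nk) = f(x0)
    assert abs(f(x0 + n * k) - 0) > eps       # so |f - 0| stays above epsilon
```

This mirrors the proof exactly: the sequence x0 + nk is unbounded, but f along it is the constant f(x0), so it cannot converge to 0 unless f(x0) = 0.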