Hi everybody, I’m trying to calculate the shape of a boundary line f(x) between two media that collimates rays from a point light source. This requires each ray to hit the boundary at a specific angle, so I calculated the slope m(φ) of the boundary line for a ray with polar angle φ (where φ is less than a known critical angle φc). Now, in order to get from m(φ) to m(x), I need φ(x). But how do I obtain it?

Specifying the boundary conditions:

i) Since I will calculate the boundary line by integrating m(x), specifying f(x=0) – i.e. the minimum distance between the source and the boundary line – determines the constant of integration.

ii) From a physical point of view, I can further specify the radius R of the “lens.” This means x ≤ R and φ(x=R) = φc.

So what is φ(x)? Or, in general terms: how can I construct a function f(x) if I know the slope as a function of the polar angle φ (with given f(x=0) and R)? It sounds easy, but I’m stuck nonetheless. Any help is appreciated!
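To make the setup concrete, here is a rough numerical sketch (Python) of the kind of integration I have in mind. It assumes the source sits at the origin with φ measured from the optical axis, so that tan φ = x/f(x) would close the system; the slope law m(φ) below is only a placeholder for illustration, not my actual expression:

```python
import math

def m(phi):
    # Placeholder slope law -- NOT the real m(phi), just something smooth
    # with m(0) = 0 so the boundary is flat on the optical axis.
    return math.tan(phi / 2)

def integrate_boundary(f0, R, n=1000):
    """Forward-Euler integration of f'(x) = m(phi(x)), assuming the closure
    phi = atan(x / f(x)) (source at origin, phi measured from the axis)."""
    dx = R / n
    x, f = 0.0, f0
    xs, fs = [x], [f]
    for _ in range(n):
        phi = math.atan2(x, f)  # assumed geometric relation phi(x)
        f += m(phi) * dx        # Euler step: f(x + dx) ~ f(x) + m(phi) dx
        x += dx
        xs.append(x)
        fs.append(f)
    return xs, fs
```

Whether that atan2 closure is actually the missing φ(x) relation is exactly what I’m unsure about.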