Hello everyone. I'm looking through an old book on optics and something there has me really confused. I attached the two pages I'm referring to, or you can find them here: http://www.maths.tcd.ie/pub/HistMath/People/Hamilton/Rays/
as pages 14 and 15 in the pdf.

Basically, what's going on is that he is talking about rays of light hitting a mirror. He wants to find the equation of a mirror surface that reflects a given system of incoming rays to a single focal point, without yet saying anything about the nature of the incoming rays. The cosines of the angles an incoming ray makes with the x-, y-, and z-axes are α, β, and γ, while the cosines for the reflected ray are α', β', and γ'. Equation (E) was already derived as the basic equation for dealing with reflections. For reference here, equation (E) is
[tex] (\alpha + \alpha ')dx + (\beta + \beta ')dy + (\gamma + \gamma ')dz = 0[/tex]
Next he argues that
[tex]\alpha 'dx + \beta 'dy +\gamma 'dz[/tex]
is an exact differential, with an argument that makes sense to me. My trouble comes next. He says

I understand neither why that expression needs to be an exact differential of two variables, nor why that last equation (equation (F)) represents that condition. Any help would be appreciated!

If anyone even knows what is meant by "an exact differential of a function of the two variables which remain independent," that would be helpful. I guessed it meant I should assume z=F(x,y), so that [itex]dz=\frac{\partial F}{\partial x}dx + \frac{\partial F}{\partial y}dy[/itex], and I substituted this in to see what equation the exactness condition would give me, but I always ended up with an equation in which F appears and cannot be eliminated.
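For reference, the condition I was trying to impose is the usual criterion for exactness in two variables: a differential P dx + Q dy is exact if and only if
[tex]\frac{\partial P}{\partial y} = \frac{\partial Q}{\partial x}[/tex]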

Thanks for the reply, andrien. My understanding was that he was saying (αdx+βdy+γdz) was an exact differential. I tried to deal with this by substituting my expression for dz to get
[tex](\alpha + \gamma\frac{\partial z}{\partial x})dx + (\beta +\gamma\frac{\partial z}{\partial y})dy[/tex]
And this would be the exact differential. Using the condition you mentioned, I get
[tex]\frac{\partial (\alpha + \gamma\frac{\partial z}{\partial x})}{\partial y} = \frac{\partial (\beta +\gamma\frac{\partial z}{\partial y})}{\partial x}[/tex]
But I don't see a way to get this to reduce to equation (F) as required.

I actually made a bit of progress on this and answered my first question, although I still don't know about the second. As for why (αdx+βdy+γdz) is an exact differential: it is now pretty clear, since it was already proven that
[tex]-d\rho '=\alpha 'dx + \beta 'dy + \gamma 'dz[/tex]
Combining this with the original equation
[tex] (\alpha + \alpha ')dx + (\beta + \beta ')dy + (\gamma + \gamma ')dz = 0[/tex]
gives
[tex]\alpha dx + \beta dy + \gamma dz = d\rho '[/tex]
But this equation applies only to the solution of the differential equation for, say, z as a function of x and y, and is not an identity for the functions α, β and γ. So the expression is, as Hamilton said, an exact differential in the two variables that remain independent.
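As a sanity check on this (a hypothetical special case I made up, not Hamilton's general argument): for rays diverging from a point source at the origin, the incident direction cosines are α = x/r, β = y/r, γ = z/r with r = √(x²+y²+z²), and then αdx+βdy+γdz is exact even as a form in all three variables, since it is just dr. A quick sympy verification:

```python
# Hypothetical special case: rays from a point source at the origin,
# so the incident direction cosines are the components of the unit
# radial vector.  Here alpha dx + beta dy + gamma dz = dr exactly,
# i.e. the form is an exact differential in all three variables, not
# merely in the two that remain independent on the mirror surface.
import sympy as sp

x, y, z = sp.symbols('x y z', positive=True)
r = sp.sqrt(x**2 + y**2 + z**2)
alpha, beta, gamma = x / r, y / r, z / r

# alpha, beta, gamma are the partial derivatives of r, so the form is dr
check_x = sp.simplify(alpha - sp.diff(r, x))
check_y = sp.simplify(beta - sp.diff(r, y))
check_z = sp.simplify(gamma - sp.diff(r, z))
print(check_x, check_y, check_z)  # all three differences are zero
```

In the general case the exactness holds only on the mirror surface itself, as noted above; the point-source system just happens to make it hold everywhere.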

The only question I still have is how that condition leads to the equation
[tex](\alpha + \alpha ')(\frac{\partial \beta}{\partial z} - \frac{\partial \gamma}{\partial y}) + (\beta + \beta ')(\frac{\partial \gamma}{\partial x} - \frac{\partial \alpha}{\partial z}) + (\gamma + \gamma ')(\frac{\partial \alpha}{\partial y} - \frac{\partial \beta}{\partial x})=0[/tex]
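One guess, in case it helps anyone else: equation (F) looks exactly like the standard integrability (Frobenius) condition for a Pfaffian equation P dx + Q dy + R dz = 0 to admit a family of solution surfaces, namely
[tex]P\left(\frac{\partial Q}{\partial z}-\frac{\partial R}{\partial y}\right)+Q\left(\frac{\partial R}{\partial x}-\frac{\partial P}{\partial z}\right)+R\left(\frac{\partial P}{\partial y}-\frac{\partial Q}{\partial x}\right)=0[/tex]
applied to equation (E) with (P, Q, R) = (α+α', β+β', γ+γ'). Since α'dx+β'dy+γ'dz = -dρ' is exact, the primed functions contribute nothing to the bracketed (curl-type) terms, which would leave only the unprimed derivatives, as in (F). I haven't fully convinced myself this is what Hamilton intends, though.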