# Taylor Expansion Without Variables?

This is just part of a larger problem, but the core of it is the equation r' = k - g*r, where k and g start out as constants. Now I need to treat everything as if it can vary slightly about its average value, so I set r = r_ave + dr, g = g_ave + dg, and k = k_ave + dk. I need to substitute these into the first equation, and I'm guessing that means Taylor expanding, but I don't see how to do that for an equation of this form. Any suggestions?
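To make the setup concrete, here is the substitution written out, assuming r_ave, g_ave, and k_ave are constants (so the derivative of r_ave + dr is just dr'):

```latex
\frac{d}{dt}\left(r_{\text{ave}} + dr\right)
  = \left(k_{\text{ave}} + dk\right)
  - \left(g_{\text{ave}} + dg\right)\left(r_{\text{ave}} + dr\right)
```

Expanding the right-hand side gives

```latex
dr' = \underbrace{k_{\text{ave}} - g_{\text{ave}}\, r_{\text{ave}}}_{\text{average part}}
    + dk - g_{\text{ave}}\, dr - r_{\text{ave}}\, dg - dg\, dr
```

My guess is that the cross term dg·dr can be dropped since both deviations are small (it's second order), but I'm not sure whether that's the correct way to "Taylor expand" an equation like this, or whether there's a more systematic approach.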