This is a continuation of my previous post, so please bear with me.

1. The problem statement, all variables and given/known data

Find the zeroes of the following function: y = 2 sec(-2x + 180°) + 3

2. Relevant equations

Factor the argument: y = 2 sec[-2(x - 90°)] + 3. Finding the zeroes means finding the x values where y = 0, therefore: 0 = 2 sec[-2(x - 90°)] + 3

3. The attempt at a solution

0 = 2 sec[-2(x - 90°)] + 3
-3 = 2 sec[-2(x - 90°)]
-1.5 = sec[-2(x - 90°)]
sec^-1(-1.5) = -2(x - 90°)

^ This is where I get stuck. When I try to take the inverse of sec or cos (sec^-1; cos^-1) here, my calculator returns Error or Undefined, so the whole thing explodes. Now, I can tell where the graph crosses the x-axis (and therefore has a y = 0 value) on a graphing tool, but how can I find this out algebraically? Thanks for the assistance.
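For what it's worth, here is the quick numerical check I used to confirm where the graph crosses the x-axis (this is my own sketch, not from the textbook): it evaluates y = 2 sec(-2x + 180°) + 3 in degrees, scans 0° to 360° for sign changes, and refines each one by bisection, skipping the intervals around secant's vertical asymptotes.

```python
import math

def y(x_deg):
    # y = 2*sec(-2x + 180 deg) + 3, with sec(t) = 1/cos(t); x in degrees
    t = math.radians(-2 * x_deg + 180)
    return 2 / math.cos(t) + 3

zeros = []
step = 0.1
x = 0.0
while x < 360:
    a, b = x, x + step
    ya, yb = y(a), y(b)
    # A genuine zero crossing has small |y| on both sides; a jump across a
    # vertical asymptote of sec also flips sign but with huge |y|, so skip those.
    if abs(ya) < 50 and abs(yb) < 50 and ya * yb < 0:
        for _ in range(60):  # bisection to refine the crossing
            m = (a + b) / 2
            if y(a) * y(m) <= 0:
                b = m
            else:
                a = m
        zeros.append(round((a + b) / 2, 4))
    x += step

print(zeros)
```

This agrees with what the graphing tool shows: four crossings per 360°, symmetric about the midline shifts.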