Hi, I am hoping to derive an equation that describes the maximum error in the estimation of angles. I would like advice on the kind of math/methods/tools (e.g., differential equations?) I could use to solve this.

Here's the problem. Imagine a square with an oblique line inside it; let's say the oblique line touches the bottom-left corner of the square. Of interest is the error in estimating the angle between the oblique and the left side of the square, which I will call the angle of the oblique. By error I mean the difference between the true angle and the estimated angle. If the length of two parallel sides of the square is misestimated relative to the other two sides, then the angle of the oblique will also be misestimated in a very specific way.

Here's the twist: the misestimation of the angle will vary depending on the physical angle between the oblique and the left side of the square. If one pair of sides of the square is misestimated by less than 50%, for example, the maximum error will presumably occur for obliques at about 40 degrees to the left side of the square.

I would like to derive a function that shows (1) at which angle of the oblique the maximum error should occur, and (2) how large the maximum error is (e.g., the angle is overestimated by 12 degrees, so the error is +12 degrees). The maximum error will depend on the percent error in the misestimated side lengths, and the error at any given angle will also depend on the physical angle of the oblique. A rough numeric sketch of the model I have in mind is below. I hope this is clear enough. Thank you for your help!
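To make the question concrete, here is a minimal sketch of the model I have in mind, written in Python. All names in it are mine, and the key assumption is that misestimating one pair of sides acts like a uniform horizontal scaling by a factor k (k = 1.5 would mean the horizontal sides look 50% longer than they are), so a true angle theta measured from the vertical left side is perceived as arctan(k·tan(theta)). This is just my starting guess, not a worked solution:

```python
import numpy as np

def perceived_angle(theta_deg, k):
    """Perceived angle (deg) of an oblique whose true angle from the left
    (vertical) side is theta_deg, if the horizontal dimension is scaled
    by a factor k in perception (assumption: tan(theta') = k * tan(theta))."""
    return np.degrees(np.arctan(k * np.tan(np.radians(theta_deg))))

def angle_error(theta_deg, k):
    """Signed error in degrees: perceived angle minus true angle."""
    return perceived_angle(theta_deg, k) - theta_deg

k = 1.5  # hypothetical: horizontal sides overestimated by 50%
thetas = np.linspace(0.01, 89.99, 100_000)  # true angles to scan
errs = angle_error(thetas, k)
i = np.argmax(np.abs(errs))
print(f"max error = {errs[i]:+.2f} deg at a true angle of {thetas[i]:.2f} deg")
# Under this model, setting the derivative of the error to zero gives a
# closed form: tan(theta*) = 1/sqrt(k), i.e. theta* = arctan(1/sqrt(k)),
# which is about 39.2 deg for k = 1.5, with a maximum error near +11.5 deg.
```

If this model is on the right track, I suspect the tool is plain differential calculus, i.e. maximizing arctan(k·tan(theta)) − theta over theta, rather than differential equations; but I would appreciate confirmation or a better approach.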