This is quite a tough one. I'm working on a flight simulator in which aircraft are modeled as rigid bodies moving under drag. I'm building the AI system, so I need to plan a path for the airplane, which means spinning it around so it points in the right direction.

Rotation is determined the usual way, by integrating torque twice. The torque equals the force applied through the flight stick, minus a linear drag term:

    T = max_rotational_velocity_konst * StickInput - w * C

where `StickInput` is in [-1..1] and `C` is the drag constant. `C` is computed from the `max_rotational_velocity_konst` and `time_to_reach_max_vel` constants, so that designers can model a plane that reaches a given angular velocity in a given time (if the stick is held at maximum for that long).

Next, the angular velocity is the integral of the torque:

    w += (T / (inv_mass * inertia)) * dt

and finally the rotation angle is the integral of the angular velocity:

    alpha += w * dt

Unfortunately, several stick movements are input each frame (so that behaviors can be tweened - ground avoidance, for one, should increase in priority as the plane gets closer to the ground), and I can't change this. So my AI inputs stick commands that roll the plane toward the desired direction by measuring the angle between its current rotation and the desired rotation: I take the cross product of the two 'Up' vectors, which gives me the sine of the angle between them, and use that value as `StickInput` - so the input naturally reaches 0 when the plane is at the right angle. This gives:

    T = max_rot_speed * sin(alpha) - w * C
    w = integral(T)
    alpha = integral(w)

What I want is to find the angle as a function of time, in order to determine how long it will take the aircraft to achieve a certain rotation. I've already found a way to do it for a constant force from here, but I don't know how to handle a varying force. Any ideas?
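For concreteness, here is a minimal numerical sketch of the closed loop described above. With the stick input set to the sine of the error angle, the equation is essentially a damped pendulum (`inertia * alpha'' = -k * sin(alpha) - C * w`), which as far as I know has no elementary closed form, so stepping it forward and watching for the error to fall below a tolerance is one way to get a time estimate. All names and constants here (`k`, `C`, `inertia`, `tol`) are hypothetical stand-ins for the designer-tuned values, not the actual code:

```python
import math

def time_to_align(alpha0, w0=0.0, k=3.0, C=1.5, inertia=1.0,
                  dt=1e-3, tol=1e-2, t_max=60.0):
    """Step alpha'' = (k * sin(-alpha) - C * w) / inertia forward in time
    (the target angle is 0, so the stick input is the sine of the error,
    i.e. sin(-alpha)) and return the first time at which both the angle
    and the angular velocity drop below `tol`, or None if that never
    happens within `t_max` seconds."""
    alpha, w, t = alpha0, w0, 0.0
    while t < t_max:
        stick = math.sin(-alpha)      # sine of error angle (cross product of the 'Up' vectors)
        torque = k * stick - C * w    # T = max_rot_speed * StickInput - w * C
        w += (torque / inertia) * dt  # w = integral(T)
        alpha += w * dt               # alpha = integral(w)
        t += dt
        if abs(alpha) < tol and abs(w) < tol:
            return t
    return None
```

For example, `time_to_align(math.pi / 2)` estimates how long nulling a 90-degree error takes under these made-up constants. Note the semi-implicit (symplectic) Euler step, updating `w` before `alpha`, which keeps the integration stable at game-sized timesteps.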