Hi all, I have a question about calculating the orbits of massive bodies in space (Newton's basic laws). I am writing a program to simulate the orbits of bodies in space -- basically, you define each object's mass, size, location, and initial velocity, then watch how they interact via gravitational attraction.

We know the following equations govern the motion of these bodies. The magnitude of the force due to gravitational attraction is [itex]F=G\frac{m_1 m_2}{r^2}[/itex], and at any instant its direction points along the displacement between the bodies' centers of mass (i.e., toward the other body). Knowing the magnitude and direction of the force, the acceleration due to gravity that object 1 undergoes is [itex]a=\frac{F}{m_1}[/itex], in the same direction as the force vector.

Now suppose I ask the following question: what is the total displacement that object 1 undergoes over an arbitrary [itex]\Delta t[/itex] (with a known initial velocity)? Currently, my program just changes the velocity vector based on the direction of [itex]a[/itex] at whatever moment the refresh was called, and from that changes the object's position. But I know this is not entirely accurate (it's comparable to finding the area under a curve by dividing it into tiny rectangles), because the direction of [itex]a[/itex] is constantly changing. What is the calculus method of doing this? Much appreciated!
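For reference, here is a minimal sketch of the per-step update the post describes: compute the gravitational acceleration at the current instant, hold it fixed over the step, update velocity, then position. All names and the 2D two-body setup are illustrative, not taken from the poster's actual program.

```python
import math

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def gravity_accel(pos1, pos2, m2):
    """Acceleration of body 1 due to body 2: magnitude G*m2/r^2,
    directed from body 1 toward body 2 (2D, positions as (x, y) tuples)."""
    dx, dy = pos2[0] - pos1[0], pos2[1] - pos1[1]
    r = math.hypot(dx, dy)
    a = G * m2 / r**2
    return (a * dx / r, a * dy / r)  # magnitude times unit direction vector

def step(pos, vel, accel, dt):
    """One update with acceleration frozen over dt: bump velocity first,
    then move the position using the updated velocity (semi-implicit Euler)."""
    ax, ay = accel
    new_vel = (vel[0] + ax * dt, vel[1] + ay * dt)
    new_pos = (pos[0] + new_vel[0] * dt, pos[1] + new_vel[1] * dt)
    return new_pos, new_vel
```

This is exactly the "tiny rectangles" approximation: the error per step shrinks as dt shrinks, but the direction of the acceleration is stale for the whole step.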
What you're doing now is essentially this: http://en.wikipedia.org/wiki/Euler_method Better are: http://en.wikipedia.org/wiki/Midpoint_method and http://en.wikipedia.org/wiki/Runge–Kutta_methods There are lots of code examples online for the specific task you are doing.
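As a concrete sketch of the Runge-Kutta suggestion, here is the classical fourth-order method (RK4) applied to a small body orbiting a fixed central mass. Instead of freezing the acceleration over the step, RK4 samples the derivative four times within each step and combines them, so the changing direction of [itex]a[/itex] is accounted for. The constants and the fixed-central-mass simplification are assumptions for illustration; in a full N-body simulation you would integrate all bodies' states together.

```python
import math

G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
M = 5.972e24    # central mass (roughly Earth's), assumed fixed at the origin

def deriv(state):
    """Time derivative of state (x, y, vx, vy): d(pos)/dt = vel,
    d(vel)/dt = -G*M/r^3 * (x, y), the inward gravitational acceleration."""
    x, y, vx, vy = state
    r = math.hypot(x, y)
    a = -G * M / r**3
    return (vx, vy, a * x, a * y)

def rk4_step(state, dt):
    """One classical RK4 step: four derivative samples per dt,
    weighted 1-2-2-1, giving O(dt^4) global accuracy."""
    k1 = deriv(state)
    k2 = deriv(tuple(s + 0.5 * dt * k for s, k in zip(state, k1)))
    k3 = deriv(tuple(s + 0.5 * dt * k for s, k in zip(state, k2)))
    k4 = deriv(tuple(s + dt * k for s, k in zip(state, k3)))
    return tuple(s + dt / 6.0 * (a + 2 * b + 2 * c + d)
                 for s, a, b, c, d in zip(state, k1, k2, k3, k4))
```

For example, starting from a circular orbit at radius [itex]r_0[/itex] with speed [itex]v=\sqrt{GM/r_0}[/itex] and taking 1000 RK4 steps per orbital period keeps the radius constant to well under one part per million, whereas plain Euler at the same step size visibly spirals.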