Greetings all. I am attempting to write a program that will calculate how long a spaceship takes to move from Point A to Point B. This is an attempt to model my favorite computer game, so it isn't 100% accurate compared to the real world: the developers introduced an element of drag to make the flying more user-friendly.

I assume that the spacecraft starts out motionless, then applies maximum thrust from its engines instantaneously. (In reality it takes perhaps 0.05 seconds to move the throttle slider on my joystick from 0 to 100, so I don't think that difference is what is throwing me off.) For the very early stages of my program, where I am just trying to figure out the proper calculations, I am assuming that Point B is 10,000 meters away. The mass of the spaceship is 67,620,000 g (67,620 kg) and the thrust provided by the engines at max throttle is 6,400,000 of whatever the applicable units of thrust are.

My understanding is that I need the calculations used in jet flight, minus the lift and gravity aspects. So far my internet research has led me to these equations:

DistanceTraveled = InitialVel * Time + (1/2) * Accel * Time^2
Accel = (Thrust - Drag) / Mass
Drag = DragFactor * Vel^2

The DragFactor of the ship in my scenario is 35.5. (Side note: the numbers given for thrust and the drag factor work out perfectly to make the ship stop accelerating at its in-game top speed. Setting Thrust = Drag gives v_max = sqrt(Thrust / DragFactor) = sqrt(6,400,000 / 35.5) ≈ 425, which lines up. So I don't think those equations are wrong.)

The only problem is that the first equation assumes acceleration is constant, while in my scenario acceleration varies due to drag. I attempted to model this with looping and small time samples, basically assuming that acceleration is constant during a very small moment of time (0.01 seconds). Unfortunately, the results of this method vary wildly depending on the size of the time sample I use. If I use incredibly small samples (I went down to 0.00001 seconds, which took my CPU a few seconds to work through that many loops), the result approaches the time it would take to cover that distance if the spacecraft started out at its maximum velocity. If I use larger samples, the result is significantly longer than what it actually takes in game. Using a stopwatch, I did time trials that showed the ship taking ~28.5 seconds to cover that distance from a full stop.

I've tried to investigate calculus operations that would help me perform these calculations, but have found nothing that seemed to apply. Of course, I've forgotten anything I learned in calculus in high school, so I may have just not understood something and how it would apply.

If anyone knows how to adjust the equations above so that they continuously account for the changing acceleration due to drag, please let me know. For anyone here who knows programming, I've included a simplified sketch of my loop at the bottom of this post.

*Edit* Thanks to this scenario being in a game, we do not have to address changing mass due to fuel consumption.

Thanks,
Istanbul

P.S. No, I'm not from Turkey, sorry to disappoint.
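
Here is the loop sketch mentioned above. It's a cleaned-up Python rendition of the approach, not my exact code: the names are placeholders, and I'm assuming the thrust is in newtons so that the 67,620,000 g mass becomes 67,620 kg.

    # Small-timestep loop: treat acceleration as constant across each
    # short interval dt, then update velocity and position.

    MASS_KG = 67_620_000 / 1000.0  # 67,620,000 g -> 67,620 kg
    THRUST = 6_400_000.0           # max throttle thrust (assumed newtons)
    DRAG_FACTOR = 35.5             # Drag = DragFactor * Vel^2
    TARGET_M = 10_000.0            # distance from Point A to Point B

    def time_to_cover(distance, dt):
        t = 0.0  # elapsed time (s)
        v = 0.0  # velocity; the ship starts from a full stop
        x = 0.0  # distance covered so far
        while x < distance:
            drag = DRAG_FACTOR * v * v
            accel = (THRUST - drag) / MASS_KG
            v += accel * dt  # acceleration held constant over dt
            x += v * dt      # advance position with the updated velocity
            t += dt
        return t

    # Shrinking dt should make the estimates converge on one answer:
    for dt in (1.0, 0.1, 0.01, 0.001):
        print(f"dt = {dt}: {time_to_cover(TARGET_M, dt):.2f} s")

In my actual program, though, the estimates never settle as dt shrinks: too long at large samples, and approaching the max-velocity travel time at tiny ones, as described above.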