xovangam
I'm programming a simulation of a projectile aimed at a known target off in the distance somewhere in my 3D world, and I also know the angle at which I'll be firing the projectile. The projectile is fired on flat ground, so the start and end heights are the same.

I'm using the equation:

d = (v^2 / g) * sin(2 * theta)

where:

d = distance to target (known)

g = gravity (known)

theta = angle off the horizontal at which I'm firing the projectile (known)

so I solve for **v** in order to get an initial velocity with which to fire the projectile.
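Rearranged for v, that's v = sqrt(d * g / sin(2 * theta)). A minimal sketch of the rearrangement (the function name is mine):

```python
import math

def launch_speed(d, g, theta):
    # rearrange d = (v^2 / g) * sin(2 * theta) for v
    return math.sqrt(d * g / math.sin(2.0 * theta))
```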

The velocity I get out of that causes my simulation to always overshoot the target (i.e. the projectile travels a bit farther than **d**). Each tick of the simulation advances the projectile like so (pseudo-code):

```
Position.X += Velocity.X * TimeStep;
Position.Y += Velocity.Y * TimeStep;
Position.Z += Velocity.Z * TimeStep - 0.5 * Gravity.Z * TimeStep * TimeStep;
Velocity.Z -= Gravity.Z * TimeStep;
```
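The tick update can be reproduced as a self-contained sketch to measure the simulated range for a given timestep (Python; `simulate_range` and its arguments are my naming, and I've flattened the motion to one horizontal axis plus the vertical axis):

```python
import math

def simulate_range(v, theta, g, dt):
    # replicate the per-tick update from the pseudo-code
    x, z = 0.0, 0.0
    vx = v * math.cos(theta)
    vz = v * math.sin(theta)
    while True:
        x += vx * dt
        z += vz * dt - 0.5 * g * dt * dt
        vz -= g * dt
        if z <= 0.0:  # landed (flat ground: back to start height)
            return x
```

One observation: the vertical update with the `0.5 * Gravity.Z * TimeStep * TimeStep` term applied before the velocity update is the exact constant-acceleration step, so the per-tick integration error should be zero. In that case any remaining overshoot comes from the landing only being observed on tick boundaries, which adds at most one tick's worth of horizontal travel (`vx * dt`).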

Because of that, I'm currently scaling the initial velocity that I calculate down by a factor of around 20 percent or so to compensate for the fact that I overshoot my target. So out of curiosity: is there some method by which I could calculate a more "exact" scale factor, relative to the size of the timesteps I'm taking, such that if I raise or lower the size of my timestep I wouldn't have to re-guess at a scale factor?
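One way to sidestep guessing a scale factor entirely is to solve for the launch speed against the simulation itself, e.g. by bisection: the analytic v from the range equation overshoots, so it can serve as the upper bound. A sketch under those assumptions (all names are mine; the inner loop re-inlines the per-tick update from the pseudo-code):

```python
import math

def solve_speed(d, theta, g, dt, iters=60):
    def sim_range(v):
        # simulated range for a candidate speed, using the same
        # per-tick update as the pseudo-code
        x, z = 0.0, 0.0
        vx, vz = v * math.cos(theta), v * math.sin(theta)
        while True:
            x += vx * dt
            z += vz * dt - 0.5 * g * dt * dt
            vz -= g * dt
            if z <= 0.0:
                return x

    lo = 0.0
    hi = math.sqrt(d * g / math.sin(2.0 * theta))  # analytic speed overshoots
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if sim_range(mid) < d:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

Because the landing is only observed on tick boundaries, the simulated range is a step function of v, so the match is only exact to within one tick's horizontal travel; but the result automatically tracks whatever timestep is in use, with no hand-tuned factor.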

TIA -x.