
I compete internationally in an extreme sport known as 'Long Drive Golf.' Essentially one succeeds by hitting a golf ball the longest distance.

For many decades the manufacture of drivers (the longest and most powerful golf clubs) has been based solely on the premise that reducing head weight lets one swing the club faster, and that swinging faster sends the ball further. (Let us for the moment ignore other factors such as centre of percussion, coefficient of restitution, etc.)

As a rough and ready guide:

If one has a swing speed of 130 mph, then using a standard driver (whose head normally weighs 200g) one would expect to hit the ball to a landing point 325 yards away.

For each 1 mph increase in swing speed one might reasonably expect a further 2-3 yards of distance. There are adverse effects on head strength etc. if head weight is reduced much below 200g.
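The rule of thumb above can be written as a simple linear estimate. This is only a sketch of the figures quoted in the post (325 yards at 130 mph, plus roughly 2-3 yards per extra mph), not a physical model; the 2.5 yards/mph midpoint is an assumption:

```python
def expected_distance_yards(swing_speed_mph, yards_per_mph=2.5):
    """Linear rule-of-thumb estimate of driving distance from swing speed.

    Baseline: 325 yards at 130 mph; yards_per_mph is an assumed midpoint
    of the 2-3 yards per mph range quoted in the post.
    """
    return 325 + yards_per_mph * (swing_speed_mph - 130)

print(expected_distance_yards(130))  # 325.0
print(expected_distance_yards(134))  # 335.0
```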

Let us imagine that increasing head weight adversely affects swing speed. The question that nevertheless arises is:

'Would the distance gained under the formula FORCE = MASS x ACCELERATION outweigh the distance lost through the drop in swing speed caused by the heavier head?'

I can measure on my radar meter any drop in swing speed due to increased head weight.

What I cannot assess is the distance gained by increasing head weight while swing speed remains constant.

Is it possible to calculate this, please? What increased distance could one expect to achieve by increasing head weight to 205g / 210g / 215g?
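One standard way to estimate this (not from the post itself) is to treat impact as a two-body collision: ball speed = clubhead speed x (1 + e) / (1 + m_ball / m_head), where e is the coefficient of restitution already mentioned above. A minimal sketch, assuming a regulation 45.93 g ball, e near the 0.83 rules limit, constant swing speed, and that carry distance scales roughly in proportion to ball speed over this narrow range:

```python
BALL_MASS_G = 45.93   # regulation golf ball mass
COR = 0.83            # coefficient of restitution, assumed near the rules limit

def smash_factor(head_mass_g, e=COR, ball_mass_g=BALL_MASS_G):
    """Ball speed divided by clubhead speed for a simple two-body collision."""
    return (1 + e) / (1 + ball_mass_g / head_mass_g)

baseline = smash_factor(200.0)
for m in (205.0, 210.0, 215.0):
    gain = smash_factor(m) / baseline - 1
    print(f"{m:.0f} g head: ball speed up {gain * 100:.2f}% at the same swing speed")
```

Under these assumptions the gains come out small, on the order of 0.5% to 1.5% in ball speed across 205-215 g, so the question becomes whether the measured drop in swing speed costs more than that.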

Thanks,

Ivan Sanders

(sanderslongdrive)