
I have been mulling this over for a while and cannot make sense of it logically.

In thinking about a sphere falling through some random liquid, how does decreasing the diameter of the sphere affect the rate at which it falls?

I am thinking of two formulas, and they seem to contradict each other in terms of their results.

First would be Stokes' law for the drag force:

F_d = 3πμVd

Here, if I reduce the diameter, the drag force decreases, which makes me think the sphere would fall faster because there is less force resisting its fall.
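As a quick numerical check of that first formula (the viscosity, speed, and diameter values below are made up for illustration, not from any particular problem):

```python
import math

def stokes_drag(mu, v, d):
    """Stokes' drag on a sphere of diameter d moving at speed v
    through a fluid of dynamic viscosity mu: F_d = 3*pi*mu*v*d."""
    return 3 * math.pi * mu * v * d

# At a fixed speed, halving the diameter halves the drag force.
f_big = stokes_drag(mu=1.0e-3, v=0.1, d=0.02)   # water-like viscosity, 2 cm sphere
f_small = stokes_drag(mu=1.0e-3, v=0.1, d=0.01)  # 1 cm sphere
```

So at the *same* speed, the smaller sphere does feel less drag, which is what the first line of reasoning is based on.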

The next would be an equation derived from the buoyant force, the drag force, and the weight, assuming these are the only three forces acting on the falling sphere, and solved for the terminal velocity:

V = [2r²(ρ_sphere − ρ_fluid)g] / (9μ)

Here it appears that if the radius is reduced, the velocity is also reduced, so the sphere falls slower.
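The second formula can be checked the same way (again with made-up densities and viscosity; a steel-in-water sort of example, chosen only to illustrate the scaling):

```python
def terminal_velocity(r, rho_sphere, rho_fluid, mu, g=9.81):
    """Stokes terminal velocity of a sphere of radius r:
    v = 2 r^2 (rho_sphere - rho_fluid) g / (9 mu)."""
    return 2 * r**2 * (rho_sphere - rho_fluid) * g / (9 * mu)

# Because v scales with r^2, halving the radius quarters the terminal velocity.
v1 = terminal_velocity(r=0.01,  rho_sphere=7800, rho_fluid=1000, mu=1.0e-3)
v2 = terminal_velocity(r=0.005, rho_sphere=7800, rho_fluid=1000, mu=1.0e-3)
```

This confirms the second formula really does predict that the smaller sphere settles more slowly, which is what seems to conflict with the drag-force argument above.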

My "logic" tells me a smaller sphere would fall faster, but it has failed me many times in physics.

Thanks for the help.