There is a tube attached to a board such that a ball can be dropped in the top, and the tube curves 90° to the right. If a ball of mass 7.6 g is dropped into the top of the tube, what is the minimum height the exit point of the tube needs to be in order for the ball to travel at least 1 meter?
[Diagram of the tube-and-board setup omitted.]
Radius of the ball is unknown.
A friction test was done: the board was tilted so that the tube acted as a half-pipe, with one side slightly higher than the other. The board was angled until a ball dropped in at the high side would just reach the other end of the tube (normally friction would stop it from going all the way). One side had to be 1.1 cm higher than the other for this to happen. However, I don't know how to use this information to find the energy lost to friction.
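One common way to read that test (an assumption on my part, not stated in the post): the ball "just reaches" the far side, so the potential energy of the 1.1 cm height difference is exactly the energy friction removes over one traverse of the tube. A minimal sketch:

```python
# Friction ("half-pipe") test: the ball just barely reaches the far side when
# one end of the board is raised 1.1 cm. Interpreting this as "the potential
# energy of the 1.1 cm height difference equals the energy lost to friction
# over one pass through the tube":
m = 0.0076   # ball mass in kg (7.6 g)
g = 9.8      # gravitational acceleration, m/s^2
dh = 0.011   # height difference in m (1.1 cm)

E_friction = m * g * dh  # energy lost to friction per traverse, in joules
print(f"Energy lost to friction per traverse: {E_friction:.6f} J")
```

That value can then be subtracted from the ball's initial potential energy before converting the remainder to kinetic energy.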
Ep = mgh
Ek = 0.5mv²
Basic kinematic equations, etc.
I don't know what equation I need for the energy of a rolling ball.
The Attempt at a Solution
I found the maximum velocity the ball could have when leaving the end of the tube by equating its kinetic energy to its potential energy:
Ep = mgh = 0.0417088 J (not rounded; using h = 0.56 m)
Ek = 0.5mv²
Ek = Ep
0.5mv² = mgh
0.5v² = gh
v² = 2gh
v = √(2gh)
v = 3.313 m/s
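As a sanity check on the arithmetic above (the drop height h = 0.56 m is my back-calculation from Ep = mgh = 0.0417088 J with m = 7.6 g; the post never states it explicitly):

```python
import math

m = 0.0076  # kg
g = 9.8     # m/s^2
h = 0.56    # m, assumed drop height (back-calculated from Ep = 0.0417088 J)

Ep = m * g * h               # potential energy at the top of the tube
v = math.sqrt(2 * g * h)     # frictionless, non-rolling exit speed
print(f"Ep = {Ep:.7f} J, v = {v:.3f} m/s")  # matches 0.0417088 J and 3.313 m/s
```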
That velocity assumes no energy loss due to friction or to rolling. Our teacher hinted that the energy lost to these two factors is in fact significant and will affect the outcome.
My problem is how to determine the energy lost from friction and from rolling. I've done some research into the formulas for a rolling sphere, but they all require the radius, which I am not given. I was told that the radius of the sphere is irrelevant.
Here's the rolling formula anyway:
v_cm = ωR
That is the only formula I could make sense of without any outside instruction: the velocity of the center of mass is equal to the angular velocity ω multiplied by the radius (I think).
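For what it's worth, the standard result for a uniform solid sphere rolling without slipping shows why the radius is irrelevant: the moment of inertia is I = (2/5)mR², and with ω = v/R the R cancels out of the kinetic energy. A sketch of that derivation in code:

```python
import math

# Rolling without slipping, uniform solid sphere:
#   E = 1/2 m v^2 + 1/2 I w^2,  with I = (2/5) m R^2 and w = v / R
#     = 1/2 m v^2 + 1/2 * (2/5) m R^2 * (v/R)^2
#     = (7/10) m v^2            <- R cancels: the radius is irrelevant
# Setting (7/10) m v^2 = m g h gives v = sqrt(10 g h / 7).
g = 9.8
h = 0.56  # m, same assumed drop height as in the attempt above

v_sliding = math.sqrt(2 * g * h)       # all PE -> translation (no rotation)
v_rolling = math.sqrt(10 * g * h / 7)  # PE split between translation and spin
print(f"sliding: {v_sliding:.3f} m/s, rolling: {v_rolling:.3f} m/s")
```

The rolling exit speed comes out noticeably lower than the sliding one, which is consistent with the teacher's hint that the rolling loss is significant.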
Thanks in advance for any help.