Distance of an object launched by a rubber band decreases as its mass increases

AI Thread Summary
The discussion focuses on proving algebraically that the distance an object launched by a rubber band decreases as its mass increases. The rubber band is treated as a spring obeying Hooke's Law, so displacing it by the same amount for each mass applies the same force. By Newton's Second Law, a constant force produces an acceleration inversely proportional to mass: as mass increases, acceleration decreases. From there it reduces to a kinematics problem in which the launch distance depends on the object's mass. The key takeaway is that increased mass means decreased acceleration, and hence a shorter launch distance.
Nastyusha
I need to prove algebraically that the distance of an object launched by a rubber band decreases as its mass increases. Can anyone help me? Thanks.
 
Treat the rubber band as a spring that obeys Hooke's Law (F=-kx). If you displace the rubber band from equilibrium by the same amount for each mass, then the force you apply will be constant. From Newton's 2nd Law, if your force is constant, acceleration is inversely proportional to mass. Then it becomes a kinematics problem.
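To make that argument concrete, here is a minimal Python sketch (not from the thread itself). It assumes the elastic energy stored in the band, (1/2)kx², is fully converted to kinetic energy, and that the object then flies as an ideal projectile launched at 45°; the spring constant `k`, stretch `x`, and the masses are arbitrary illustrative values:

```python
import math

def launch_distance(m, k=50.0, x=0.10, angle_deg=45.0, g=9.81):
    """Ideal range of a projectile launched by a stretched band.

    Elastic energy (1/2) k x^2 becomes kinetic energy (1/2) m v^2,
    so v = x * sqrt(k / m); the ideal projectile range is v^2 sin(2θ) / g.
    """
    v = x * math.sqrt(k / m)  # launch speed falls as mass rises
    return v**2 * math.sin(2 * math.radians(angle_deg)) / g

masses = [0.05, 0.10, 0.20]  # kg (arbitrary)
distances = [launch_distance(m) for m in masses]
# with v^2 = k x^2 / m, the range is proportional to 1/m,
# so the distances are strictly decreasing as mass increases
assert all(d1 > d2 for d1, d2 in zip(distances, distances[1:]))
```

Since the range is proportional to v², and v² = kx²/m, doubling the mass halves the launch distance under these idealized assumptions (no air resistance, all elastic energy transferred).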
 
Can you explain that further? I'm not quite getting it.
 
Nastyusha said:
Can you explain that further? I'm not quite getting it.

To expand on JohnnyA42's point: Newton's 2nd law gives F = ma, i.e. a = F/m. Since F is constant, increasing m must decrease a.
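The inverse proportionality can be checked numerically; this tiny sketch (force and masses chosen arbitrarily for illustration) shows acceleration falling as mass grows under a fixed force:

```python
F = 2.0                    # constant applied force in newtons (arbitrary)
for m in [0.1, 0.2, 0.4]:  # masses in kg (arbitrary)
    a = F / m              # Newton's 2nd law: a = F / m
    print(f"m = {m:.1f} kg -> a = {a:.1f} m/s^2")
# each doubling of the mass halves the acceleration: 20, 10, 5 m/s^2
```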
 