Hi everyone, I've done part of this problem, but I'm wondering how you would approach it: A ball of mass M and radius R1 on the end of a thin massless rod is rotated in a horizontal circle of radius Ro about an axis of rotation AB. a. Considering the mass of the ball to be concentrated at its center of mass, calculate its moment of inertia about AB. b. Using the parallel axis theorem and considering the finite radius of the ball, calculate the moment of inertia of the ball about AB. c. Calculate the percentage error introduced by the point mass approximation for R1 = 10 cm and Ro = 1.0 m. I'm having difficulty with these problems and was wondering what you would suggest. Thanks a lot.
If you're making the point-mass approximation, the moment of inertia is Mr^2 with r = Ro + R1.

If you consider the finite size of the ball, a sphere has a moment of inertia of 2(Mr^2)/5 about its center (you can either look that up or compute it from the definition of moment of inertia; here, r = R1). Using the parallel axis theorem, you should find that the moment of inertia about the axis AB is I = Io + MRo^2, or a little cleaner:

[tex] I = \frac 2 5 MR_1^2 + MR_o^2 [/tex]

For the last part, you just have to plug in numbers. Hope that helped.
I don't know if this makes a difference, James, but Ro goes all the way to the center of the sphere, so I don't think you should include R1 in part A.
You're right; for the first part, r = Ro, not r = Ro + R1 as I said. (The second part is unchanged.) Thanks.
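For part (c), the mass M cancels when you take the ratio, so the percentage error depends only on R1 and Ro. Here's a quick sketch of the plug-in step in Python (M is set to 1 arbitrarily, since it drops out):

```python
# Percentage error of the point-mass approximation for the ball's
# moment of inertia about axis AB (part c).
M = 1.0      # kg; arbitrary, cancels in the ratio
R1 = 0.10    # m, radius of the ball
Ro = 1.0     # m, radius of the circle (axis AB to ball's center)

I_point = M * Ro**2                      # part (a): point-mass approximation
I_exact = (2/5) * M * R1**2 + M * Ro**2  # part (b): parallel axis theorem

pct_error = (I_exact - I_point) / I_exact * 100
print(f"{pct_error:.2f}%")  # prints 0.40%
```

So the point-mass approximation is off by only about 0.4% here, because (2/5)R1^2 is small compared to Ro^2.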