I am attempting to verify the solution to a variable-rotational-inertia problem.

The setup: a sphere of diameter 0.125 m and mass msfw = 0.997 kg sits on a massless rod with a perpendicular axle, so the sphere rotates about the axle at a distance r from the center of rotation. That distance increases continuously from 15 inches to 20 inches over the interval t = 0 to 10 s; that is, r(t) runs from 15 in at t = 0 to 20 in at t = 10 s.

The moment of inertia follows from the parallel-axis theorem (Steiner's theorem):

[itex]I(t) = \tfrac{2}{5} m_{sfw} R^2 + m_{sfw}\, r(t)^2[/itex]

where R is the radius of the sphere. The angular velocity [itex]\omega[/itex] is to remain constant at 1000 rpm.

The question to answer: what torque [itex]\tau[/itex] must be applied to the axle to maintain the constant angular velocity [itex]\omega[/itex]?

Since torque equals dL/dt, the solution is simple:

[itex]\tau = \frac{I(t_e)\,\omega - I(t_i)\,\omega}{\Delta t} = 1.226\ \mathrm{N \cdot m}[/itex]

where [itex]t_i[/itex] and [itex]t_e[/itex] are the initial and ending times.

To check this answer, I tried to solve the same physical setup using linear momentum. Treating the sphere as a point mass, I computed the linear velocity v = ωr from the angular velocity (1000 rpm) at both the 15 in and 20 in radii, formed the initial and final momenta from those two velocities, used F = dp/dt to get the resultant force on the mass, and then took torque = force × lever arm (r).

The result is 0.703 N·m, versus 1.226 N·m from the angular-momentum approach. Can anyone offer an explanation for the discrepancy?
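For reference, here is a short numeric sketch of both calculations. It assumes the 2/5·m·R² term cancels in the inertia difference (it is constant in time) and, since the post does not say which radius was used as the lever arm in the linear-momentum version, it uses the mean of the two radii — that choice is my assumption, not from the original working:

```python
import math

# Given data from the post
m = 0.997                      # sphere mass, kg
R = 0.125 / 2                  # sphere radius, m
IN = 0.0254                    # meters per inch
r_i, r_e = 15 * IN, 20 * IN    # initial / ending radii, m
dt = 10.0                      # duration of the radius change, s
omega = 1000 * 2 * math.pi / 60  # 1000 rpm in rad/s

def I(r):
    """Moment of inertia: sphere about its center + parallel-axis term."""
    return 0.4 * m * R**2 + m * r**2

# Angular-momentum approach: tau = dL/dt with omega held constant
tau_angular = (I(r_e) - I(r_i)) * omega / dt

# Linear-momentum approach (point mass): F = dp/dt of the tangential
# momentum, then tau = F * r. The lever arm here is the MEAN radius --
# an assumption, since the post does not specify which r was used.
F = m * omega * (r_e - r_i) / dt
tau_linear = F * (r_i + r_e) / 2

print(f"tau via angular momentum: {tau_angular:.3f} N*m")
print(f"tau via linear momentum:  {tau_linear:.3f} N*m")
print(f"ratio:                    {tau_angular / tau_linear:.3f}")
```

With the mean-radius choice the ratio of the two results is exactly 2, since (r_e² − r_i²) = (r_e − r_i)(r_e + r_i); the printed values will not match the post's 1.226 and 0.703 exactly, which suggests slightly different rounding or unit conversions were used there.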