This is out of my own interest/practice.

1. The problem statement, all variables and given/known data

A thin rod (of width zero, but not uniform) is pivoted freely at one end about the horizontal z axis, so it is free to swing in the xy plane (x horizontal, y vertically down). Its mass is m and its CM is a distance a from the pivot. The rod is struck with a horizontal force F which delivers an impulse F Δt = ξ at a distance b below the pivot. Find the impulse η delivered to the pivot.

2. Relevant equations

F Δt = ξ (applied at distance b)
ΔL = T Δt = r × (F Δt) (angular impulse)
p = m v_CM
T = r × F
v = ωr

3. The attempt at a solution

First I found the angular momentum in terms of the impulse. The angular impulse about the pivot gives L = Iω = bξ (direction from ĵ × î = −k̂). Solving for the angular velocity, ω = bξ/I, and since v_CM = aω, the linear momentum is p = m v_CM = mabξ/I.

Now here's my trouble: my friend is telling me that the strike impulse ξ and the pivot impulse η should together add up to the change in linear momentum, ξ + η = m v_CM, and that this is how to find the impulse delivered to the pivot. But I don't understand why that would be true.
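To see what my friend's suggestion would imply, I tried a quick numerical sketch in Python. The uniform-rod values I = mL²/3 and a = L/2 are just my own test numbers for checking the algebra (the rod in the problem is non-uniform), and the relation ξ + η = m v_CM is the one my friend claims, not something I've derived:

```python
# Sketch: what eta would be IF the claimed relation xi + eta = m*v_CM holds.
# Test values assume a uniform rod of length L pivoted at one end
# (I = m L^2/3, a = L/2); the problem's rod is non-uniform, so these
# numbers only check the algebra, not the actual answer.

m, L = 2.0, 1.5          # test mass (kg) and length (m)
I = m * L**2 / 3.0       # moment of inertia about the pivot (uniform rod)
a = L / 2.0              # CM distance from the pivot
xi = 4.0                 # strike impulse F*dt (N*s)

def pivot_impulse(b):
    """Pivot impulse eta from the claimed relation xi + eta = m*a*omega."""
    omega = b * xi / I    # angular impulse about the pivot: I*omega = b*xi
    p_cm = m * a * omega  # CM momentum just after the strike
    return p_cm - xi      # eta = (change in momentum) - xi

for b in (0.5 * L, 2.0 * L / 3.0, L):
    print(f"b = {b:.3f} m  ->  eta = {pivot_impulse(b):+.4f} N*s")
```

With these numbers η comes out exactly zero at b = 2L/3 = I/(ma), which would be the center of percussion, so the relation at least looks plausible. I just don't see where it comes from.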