1. The problem statement, all variables and given/known data

(a) A particle moves in a plane with constant radial velocity $\dot{r}$ and constant angular velocity $\dot{\theta}$. When the particle is at distance r from the origin, determine the speed of the particle.

Data: $\dot{r}$ = 3.2 m/s; $\dot{\theta}$ = 2.3 rad/s; r = 3.2 m.

(b) When the particle is at distance r from the origin, determine the magnitude of the acceleration.

2. Relevant equations

a = v^2/r

3. The attempt at a solution

This question was posted on these forums before, which I found using the search function. The answer to part (a) was derived with the help from that thread. The result is 8.026 m/s.

I am not sure about part (b), however. In my search I have found many suggesting that a = v^2/r. Why is this the case? Also, I haven't been able to get that to work. What I tried is this:

- Find the circumference of the circle with radius 3.2 m, which is 2πr ≈ 20.106 m.
- Find the period of the particle, which is 2π/2.3 ≈ 2.732 s.
- Using those, find the tangential speed in m/s, which I calculated to be 7.36 m/s.
- If I use that in the equation a = v^2/r, I get a = 7.36^2/3.2 ≈ 16.93 m/s^2, which is incorrect.

What am I missing? Also, where does a = v^2/r come from? Thanks in advance for the help!
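The numbers above can be checked with a short script. This is just a numeric sketch, assuming the part (a) speed comes from combining the radial velocity component $\dot{r}$ with the tangential component $r\dot{\theta}$ (the variable names are mine, not from the problem):

```python
import math

r_dot = 3.2       # radial velocity, m/s (given)
theta_dot = 2.3   # angular velocity, rad/s (given)
r = 3.2           # distance from origin, m (given)

# Part (a): speed from the two polar velocity components.
speed = math.sqrt(r_dot**2 + (r * theta_dot)**2)
print(f"speed = {speed:.3f} m/s")      # ≈ 8.026 m/s, matching the thread result

# Part (b) attempt: circumference over period gives the tangential speed.
C = 2 * math.pi * r                    # circumference ≈ 20.106 m
T = 2 * math.pi / theta_dot            # period ≈ 2.732 s
v_tan = C / T                          # note this collapses to r * theta_dot = 7.36 m/s
a = v_tan**2 / r                       # ≈ 16.93 m/s^2
print(f"v_tan = {v_tan:.2f} m/s, a = {a:.2f} m/s^2")
```

One thing the script makes visible: C/T is algebraically just $r\dot{\theta}$, the tangential speed. The formula a = v^2/r is the centripetal acceleration for *uniform* circular motion (constant r), which may be why it doesn't give the full answer here, where $\dot{r} \neq 0$.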