Doom of Doom
I'm doing a lab in my physics class on rotational motion. Part of the lab was understanding that the device we are working with is not frictionless, and that we will have frictional losses. We got the thing spinning at 500 deg/s, and had a device give us measurements of the angular velocity every second until it reached 150 deg/s (it took about 15 minutes).
We then fit the data to the relation ω = ω0·e^(−βt),
where ω0 is the initial angular velocity and β is some constant.
We then graphed the values of ln(ω) vs time, and found a linear relationship:
ln(ω) = ln(ω0) − βt
Thus, from the slope, we determined a value for β. Since the fit worked, the decay really is exponential, which means dω/dt = −βω, and therefore the frictional torque must be proportional to the angular velocity. However, I still don't understand why that should be the case physically.
Can you guys help explain this?
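For what it's worth, the step from the fitted β to a velocity-dependent torque can be written out explicitly. Writing I for the moment of inertia of the apparatus (a symbol not used in the post, introduced here for illustration):

$$
\tau = I\frac{d\omega}{dt} = I\frac{d}{dt}\left(\omega_0 e^{-\beta t}\right) = -I\beta\,\omega_0 e^{-\beta t} = -I\beta\,\omega
$$

so the magnitude of the frictional torque is proportional to ω, with proportionality constant Iβ.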
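Here is a minimal sketch of the analysis described above, using synthetic data in place of the lab measurements (the value of β below is hypothetical, chosen only so that ω decays from 500 deg/s to roughly 150 deg/s over 15 minutes, matching the numbers in the post):

```python
import numpy as np

# Hypothetical data mimicking the lab run: angular velocity sampled
# once per second for 15 minutes (900 s), starting at 500 deg/s.
beta_true = 0.00134                 # 1/s, chosen so omega(900 s) ~ 150 deg/s
t = np.arange(0.0, 901.0, 1.0)      # seconds
omega = 500.0 * np.exp(-beta_true * t)

# Linear fit of ln(omega) vs t:  ln(omega) = ln(omega0) - beta * t
slope, intercept = np.polyfit(t, np.log(omega), 1)
beta_fit = -slope
omega0_fit = np.exp(intercept)

print(f"beta ~ {beta_fit:.5f} 1/s, omega0 ~ {omega0_fit:.1f} deg/s")
```

On real data the points would scatter about the line, and the slope of the least-squares fit would give the experimental estimate of β.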