Distance between two axes in the Parallel Axis Theorem

AI Thread Summary
The discussion revolves around understanding the distance D between the center of mass and a corner axis in the context of the Parallel Axis Theorem. The moment of inertia of a rectangular sheet about a perpendicular axis through a corner is given as I = (1/3)M(a^2 + b^2). Participants clarify that D is the distance from the center of mass to the corner, built from half of each side: D^2 = (a/2)^2 + (b/2)^2. The relationship follows from the Pythagorean theorem. Overall, the conversation emphasizes the geometric interpretation of axis placement in relation to the dimensions of the rectangular sheet.
Sunbodi

Homework Statement


The moment of inertia for a perpendicular axis through the center of a uniform, thin, rectangular metal sheet with sides a and b is (1/12)M(a^2 + b^2). What is the moment of inertia if the axis is through a corner?

The answer is given, as this was a PowerPoint lecture; it is I = (1/3)M(a^2 + b^2).
I'm not looking for how to solve the whole thing, since the moment of inertia is given in the context of the problem. I'm trying to understand how D was found, i.e. why D^2 = (a/2)^2 + (b/2)^2.

Homework Equations


I = I (cm) + Md^2

The Attempt at a Solution


D is meant to be the distance between the new axis and the center of mass. If I were to solve this myself, I'd say the center of mass lies half the distance between the axis that contains a and the axis that contains b. If that's the case, the distance from the center of mass to a diagonal would be a^2 + b^2, because you're increasing both a and b by the same distance they originally were from the center of mass.
 
For clarification: you're trying to understand how the distance between two different axes on a sheet depends on the sheet's dimensions?

I'm asking because I see questions, but then answers to those questions in later sentences.
 
RomegaPRogRess said:
For clarification: you're trying to understand how the distance between two different axes on a sheet depends on the sheet's dimensions?

I'm asking because I see questions, but then answers to those questions in later sentences.
I'm trying to understand why D^2 = (a/2)^2 + (b/2)^2. To me it seems as if it should simply be D^2 = a^2 + b^2.
 
Sunbodi said:
the center of mass is half the distance between the axis that contains a and the axis that contains b
a and b are the dimensions of the plate. In what sense does an axis contain them?
Sunbodi said:
the distance from the center of mass to a diagonal
The distance from the centre to a diagonal is zero. You want the distance to a corner.
It's a/2 parallel to one axis, then b/2 parallel to the other. What does Pythagoras have to say on the matter?
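The arithmetic behind that hint can be checked quickly. The following sketch (mine, not from the thread) works with the coefficients of M(a^2 + b^2), since D^2 = (a/2)^2 + (b/2)^2 = (a^2 + b^2)/4 means the M D^2 term contributes a coefficient of 1/4:

```python
from fractions import Fraction

# Parallel axis theorem, tracking only the coefficient of M(a^2 + b^2):
#   I_corner = I_cm + M D^2,  with  I_cm = (1/12) M (a^2 + b^2)
#   and D^2 = (a/2)^2 + (b/2)^2 = (1/4)(a^2 + b^2).
coeff_cm = Fraction(1, 12)   # centre-of-mass axis coefficient
coeff_D2 = Fraction(1, 4)    # Pythagoras: M D^2 adds (1/4) M(a^2 + b^2)

coeff_corner = coeff_cm + coeff_D2
print(coeff_corner)  # 1/3, matching I = (1/3) M (a^2 + b^2)
```

Using exact fractions avoids any floating-point doubt about whether 1/12 + 1/4 really is 1/3.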
 
haruspex said:
a and b are the dimensions of the plate. In what sense does an axis contain them?

The distance from the centre to a diagonal is zero. You want the distance to a corner.
It's a/2 parallel to one axis, then b/2 parallel to the other. What does Pythagoras have to say on the matter?

Thank you so much! The second part of your comment really helped me. I've noticed how you're consistently on these forums helping people out, whether it's high-level physics or high-school material, and it's really appreciated.
 
Sunbodi said:
Thank you so much! The second part of your comment really helped me. I've noticed how you're consistently on these forums helping people out, whether it's high-level physics or high-school material, and it's really appreciated.
You are welcome.
 