WaterAndSand
Hey all, I'm relatively new here, so I hope I'm posting in the right place. Please forgive me if not. This seems like a simple question to me, but my dad vehemently disagrees.
Setup:
Let's say we have a vinyl record spinning at a standard 33 RPM. The stylus begins at the outside edge of a standard record and moves slowly towards the center over the duration of play time. The laws of motion dictate that, under a constant angular velocity, a point at the outer edge of the record must move at a faster linear speed than a point near the center.
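To put rough numbers on that, here's a quick back-of-envelope sketch in Python. The groove radii (about 14.6 cm at the outer edge and 6 cm near the label) are just ballpark figures I'm assuming for a 12-inch LP, not exact specs:

```python
import math

RPM = 33                             # rotation rate assumed in this post
omega = RPM * 2 * math.pi / 60       # angular speed in rad/s (~3.45)

# Assumed groove radii for a 12-inch LP -- ballpark figures, not official specs
r_outer = 0.146   # metres, near the outside edge
r_inner = 0.060   # metres, near the label

for name, r in [("outer", r_outer), ("inner", r_inner)]:
    v = omega * r                    # linear (tangential) speed, v = omega * r
    print(f"{name} groove: v = {v:.2f} m/s")

# outer groove: v = 0.50 m/s
# inner groove: v = 0.21 m/s  (about 2.4x slower near the center)
```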
My question is this:
Given that the record maintains a constant 33 RPM (an angular frequency of about 3.45 rad/s) and the radius between the stylus and the center of the record gradually shrinks, doesn't the shape/size of the groove also have to change to keep the record from sounding as though it is playing slower?
My logic is that as the radius shrinks, so does the linear velocity of the groove as it moves under the stylus. Because the stylus itself stays put, the groove moves past it faster at the edge and slower near the center, and thus would produce an altered sound unless the grooves/player were designed to compensate for this.
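If that logic holds, the same audio tone would have to be cut with a different physical wavelength depending on radius. Another rough sketch, using the same assumed radii as above and a 1 kHz test tone I picked arbitrarily:

```python
import math

omega = 33 * 2 * math.pi / 60        # rad/s at 33 RPM, as above
f_audio = 1000.0                     # Hz -- an arbitrary test tone I picked

for name, r in [("outer", 0.146), ("inner", 0.060)]:   # same assumed radii, metres
    v = omega * r                        # speed of the groove past the stylus
    wavelength_mm = 1000 * v / f_audio   # physical length of one cycle cut into the groove
    print(f"{name}: one cycle of a 1 kHz tone takes {wavelength_mm:.2f} mm of groove")

# outer: one cycle of a 1 kHz tone takes 0.50 mm of groove
# inner: one cycle of a 1 kHz tone takes 0.21 mm of groove
```

So the wiggles for a given note would have to be squeezed more than twice as tightly near the center, which is exactly the kind of change in groove shape I'm asking about.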
My father says this is not so. I am only 24, so I have admittedly less experience than he does with record players, and perhaps I'm misunderstanding their function. He says the radius at which the stylus sits doesn't matter because the record still spins at 33 RPM.
I used a theoretical hundred-mile-wide record to challenge this, given that at 33 RPM the groove near the edge of such a record would pass under the stylus at hundreds of thousands of miles per hour, and he maintains that the record would play just the same.
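For the sake of argument, here's what the edge speed actually works out to on that hypothetical record (taking "hundred mile wide" to mean a 50-mile radius):

```python
import math

omega = 33 * 2 * math.pi / 60        # rad/s at 33 RPM
radius_miles = 50                    # a 100-mile-wide record -> 50-mile radius

v_miles_per_hour = omega * radius_miles * 3600   # v = omega * r, converted to per hour
print(f"edge speed ~ {v_miles_per_hour:,.0f} mph")
# edge speed ~ 622,035 mph
```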
Another question I posed, which he was unable to answer, is what would happen if we took one ring of the groove from near the center and stretched it out along the outer edge. I maintain that the inner ring, which previously passed under the stylus in exactly one revolution, would now pass in only a fraction of a revolution when placed at the edge, and would therefore play back faster.
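Using the same assumed radii as before, the fraction works out like this:

```python
import math

T = 60 / 33                          # seconds per revolution at 33 RPM (~1.82 s)
r_inner, r_outer = 0.060, 0.146      # same assumed radii as before, in metres

ring_length = 2 * math.pi * r_inner               # groove length of one inner revolution
fraction = ring_length / (2 * math.pi * r_outer)  # = r_inner / r_outer
print(f"at the edge it spans {fraction:.2f} of a revolution,")
print(f"so it passes the stylus in {fraction * T:.2f} s instead of {T:.2f} s")
# ~0.41 of a revolution: ~0.75 s instead of ~1.82 s
```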
Honestly, his theory seems silly to me, but mine is apparently equally silly to him. I feel like Physics I taught me all I need to know about this problem, but I'm unsure whether there is something I'm simply overlooking. Unfortunately, I have been unable to find any definitive answer about this using Google, and I just don't know enough about records to challenge someone who used them extensively in their heyday.
So, Physics Forum, which is it? Does a record player compensate for this difference in velocity through its design (grooves, stylus arm, etc.), or is the groove cut the same at any point, with the radius of the stylus location having no effect, making me a complete fool who doesn't understand how rotation works?
If I am right, does anybody have another way of getting this across to him?
Thanks, and sorry again if I posted in the wrong place.