# String with variable density

## Homework Statement

The linear mass density in a string is given by μ = μ0[1 + cos(x/R)] where R is a constant. If one averages this density over the large size L it becomes uniform: <μ> = μ0, where <…> means averaging. What is the minimum size L (in terms of R) such that the density can be considered uniform with an error less than 1% ?

## The Attempt at a Solution

So I integrate with respect to x over the range 0 to L and then divide by L because I'm averaging, and what I get is ⟨μ⟩ = μ0 + (μ0 R/L) sin(L/R). However, here is my problem: the initial mass density makes no sense. When x = πR the density is zero. How can the density be zero on a freaking string? That makes no sense. Besides that, I don't know what is meant by "error". Should I set μ0 + (μ0 R/L) sin(L/R) equal to 0.99 μ0 and then solve?
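As a sanity check on the averaging step, here is a quick numerical sketch (the particular values of L, R, and μ0 are arbitrary placeholders, and `avg_density` is a hypothetical helper) comparing a midpoint-rule average of μ(x) = μ0[1 + cos(x/R)] against the closed form μ0[1 + (R/L) sin(L/R)]:

```python
import math

def avg_density(L, R=1.0, mu0=1.0, n=100_000):
    # Midpoint-rule average of mu(x) = mu0 * (1 + cos(x/R)) over [0, L]
    dx = L / n
    total = sum(mu0 * (1 + math.cos((i + 0.5) * dx / R)) for i in range(n))
    return total / n  # equals (1/L) * integral

def avg_density_closed_form(L, R=1.0, mu0=1.0):
    # Result of doing the integral by hand: <mu> = mu0 * (1 + (R/L) * sin(L/R))
    return mu0 * (1 + (R / L) * math.sin(L / R))

L = 7.3  # arbitrary test length
print(avg_density(L), avg_density_closed_form(L))
```

The two values agree to many decimal places, which supports the integration; the remaining question is how large L must be for the sin(L/R) correction term to be negligible.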

It appears I messed up in my title. I meant variable density.

[Moderator's note: thread title has been corrected by Redbelly98]

Redbelly98 (Staff Emeritus, Homework Helper):

> So I integrate with respect to x over the range 0 to L and then divide by L because I'm averaging, and what I get is ⟨μ⟩ = μ0 + (μ0 R/L) sin(L/R). However, here is my problem: the initial mass density makes no sense. When x = πR the density is zero. How can the density be zero on a freaking string? That makes no sense.

You're right that it can't be zero on a real string. Better to think of it as negligibly small compared to the average, and calling it zero is an approximation.

> Besides that, I don't know what is meant by "error". Should I set μ0 + (μ0 R/L) sin(L/R) equal to 0.99 μ0 and then solve?
Yes. Ideally, it should be solved twice, using both 1.01 μ0 and 0.99 μ0, since the sine term can push the average either above or below μ0.

Chestermiller (Mentor):
Don't forget that the maximum and minimum values that sin(L/R) can take on are +1 and -1.

Chet
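Following Chet's hint that sin(L/R) is bounded by ±1: the relative error is |⟨μ⟩/μ0 − 1| = |sin(L/R)|/(L/R) ≤ R/L, so requiring R/L ≤ 0.01 gives L ≥ 100R as the size that guarantees less than 1% error no matter where the sine lands. A short sketch (with a hypothetical helper `rel_error`) checking that the error indeed stays below 1% for all L ≥ 100R:

```python
import math

def rel_error(L_over_R):
    # |<mu>/mu0 - 1| = |sin(L/R)| / (L/R), which is bounded above by R/L
    return abs(math.sin(L_over_R)) / L_over_R

# Worst case is |sin(L/R)| = 1, so L/R >= 100 caps the error at 1%.
samples = [100.0 + 0.05 * k for k in range(20_000)]  # L/R from 100 to 1100
print(max(rel_error(x) for x in samples))  # never exceeds 0.01
```

The scan is just a numerical confirmation; the analytic bound R/L ≤ 0.01 is what actually answers the problem.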