What is the Minimum Size L for Uniform Density with Less Than 1% Error?


Homework Help Overview

The problem involves determining the minimum size L of a string with a variable linear mass density defined by μ = μ0[1 + cos(x/R)], such that the average density over this length is uniform with less than 1% error. The original poster expresses confusion regarding the implications of the density reaching zero at certain points and the concept of error in this context.

Discussion Character

  • Exploratory, Assumption checking

Approaches and Questions Raised

  • The original poster attempts to average the density by integrating over the range from 0 to L and dividing by L, but questions the validity of the density being zero at specific points. They also seek clarification on how to interpret the error in their calculations.

Discussion Status

Some participants provide guidance on interpreting the density values and suggest that the zero density can be viewed as negligibly small. There is a suggestion to handle the error by solving for both 1.01μ0 and 0.99μ0, indicating a productive direction in the discussion.

Contextual Notes

Participants note that the maximum and minimum values of the sine function must be considered, which may affect the calculations regarding the average density.

xdrgnh

Homework Statement


The linear mass density in a string is given by μ = μ0[1 + cos(x/R)], where R is a constant. If one averages this density over a large size L, it becomes uniform: <μ> = μ0, where <…> means averaging. What is the minimum size L (in terms of R) such that the density can be considered uniform with an error less than 1%?



Homework Equations





The Attempt at a Solution



So I integrate with respect to dx over the range 0 to L, then divide by L because I'm averaging, and what I get is <μ> = μ0 + μ0(R/L)sin(L/R). However, here is my problem: the initial mass density makes no sense. When x = πR the density is zero. How can the density be zero on a string? That makes no sense. Besides that, I don't know what is meant by error. Should I set μ0 + μ0(R/L)sin(L/R) equal to 0.99μ0 and solve?
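The averaging step can be checked numerically. Below is a sketch I'm adding for illustration (not from the thread), with R = μ0 = 1 assumed: the closed-form average <μ> = μ0[1 + (R/L)sin(L/R)] should agree with a direct numerical integral of μ(x).

```python
from math import cos, sin

# Sketch (assumes R = mu0 = 1): verify the closed-form average
#   <mu> = (1/L) * integral_0^L mu0*(1 + cos(x/R)) dx = mu0*(1 + (R/L)*sin(L/R))
# against a midpoint-rule numerical integral.

def avg_numeric(L, R=1.0, mu0=1.0, n=100_000):
    """Midpoint-rule average of mu(x) = mu0*(1 + cos(x/R)) over [0, L]."""
    dx = L / n
    total = sum(mu0 * (1.0 + cos((i + 0.5) * dx / R)) for i in range(n))
    return total * dx / L

def avg_closed(L, R=1.0, mu0=1.0):
    """Closed-form average from integrating cos(x/R) exactly."""
    return mu0 * (1.0 + (R / L) * sin(L / R))

print(avg_numeric(10.0), avg_closed(10.0))  # the two should agree closely
```

With the closed form confirmed, the fractional deviation from μ0 is (R/L)|sin(L/R)|, which is what the error condition constrains.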
 
It appears I messed up in my title. I meant variable density.

[Moderator's note: thread title has been corrected by Redbelly98]
 
Last edited by a moderator:
xdrgnh said:

The Attempt at a Solution



So I integrate with respect to dx over the range 0 to L, then divide by L because I'm averaging, and what I get is <μ> = μ0 + μ0(R/L)sin(L/R). However, here is my problem: the initial mass density makes no sense. When x = πR the density is zero. How can the density be zero on a string? That makes no sense.

You're right that it can't be zero on a real string. Better to think of it as negligibly small compared to the average, and calling it zero is an approximation.

Besides that, I don't know what is meant by error. Should I set μ0 + μ0(R/L)sin(L/R) equal to 0.99μ0 and solve?
Yes. Ideally, it should be solved twice, using both 1.01μ0 and 0.99μ0.
 
Don't forget that the maximum and minimum values that sin(L/R) can take on are +1 and -1.

Chet
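Putting the hints together (a sketch I'm adding, with R = 1 assumed for illustration): the fractional error is (R/L)|sin(L/R)|, and since |sin(L/R)| ≤ 1 in the worst case, requiring R/L ≤ 0.01 guarantees the error stays below 1% for every L at or beyond the bound, giving L ≥ 100R.

```python
from math import sin

# Fractional deviation of the average density from mu0 (R = 1 assumed):
#   |<mu>/mu0 - 1| = (R/L) * |sin(L/R)|  <=  R/L
# In the worst case |sin(L/R)| = 1, so the error is below 1% whenever L >= 100 R.

def frac_error(L, R=1.0):
    """Fractional error of <mu> relative to mu0 for averaging length L."""
    return (R / L) * abs(sin(L / R))

R = 1.0
L_min = 100.0 * R  # from the worst-case bound R/L <= 0.01

# Scan lengths L >= L_min: the error never exceeds 1%.
worst = max(frac_error(L_min + 0.01 * k, R) for k in range(100_000))
print(worst <= 0.01)  # True
```

Solving with only sin(L/R) = +1 (the 1.01μ0 case) or only −1 (the 0.99μ0 case) gives the same |deviation| = R/L, which is why the worst-case bound covers both.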
 
