Optimizing Areal Density in a Square with Randomly Distributed Circles

ManuelCalavera
Hi,

I'm not sure if this is the right forum for this question; it's not a homework problem, but it is a math/statistics problem, and I'm really not sure how to even start.

So the problem is this:
You have a square with side length W and circles whose diameters are independently sampled from the same Gaussian distribution. The standard deviation of the distribution is fixed, but you can choose the mean, subject to a minimum allowed mean value.

You want to find the mean diameter that maximizes the areal density (number of circles per unit area) you can fit in the square. The circles can't overlap, and they must lie entirely inside the square (whole circles only).

I'm almost sure the answer is the minimum mean value, so I guess I just have to prove that. There might be issues with the formulation of the problem, but I'm hoping they can be avoided if the minimum mean value always maximizes the areal density, regardless of how the circles are chosen/sampled/packed.

My background is in engineering/physics, so I don't have a first-principles math education, and I'm really not sure how to even begin.
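
A quick numerical sanity check of the minimum-mean conjecture might look like the sketch below. It is entirely illustrative: the shelf-packing heuristic, the NumPy sampling, and every parameter value are assumptions added here, not part of the original problem.

```python
import numpy as np

# Minimal sketch: estimate areal density for a given mean diameter by
# packing sampled circles into the square with a simple "shelf" heuristic.
# Each circle sits in its own d x d bounding box, placed row by row, so
# circles never overlap. This is a packing heuristic, not optimal packing.

def estimate_density(mean, sigma, W, rng, n_draws=10_000):
    """Circles per unit area for one sampled diameter sequence."""
    diams = rng.normal(mean, sigma, n_draws)
    diams = diams[diams > 0]          # discard unphysical negative draws
    x = y = row_height = 0.0
    count = 0
    for d in diams:
        if x + d > W:                 # current row is full: start a new one
            x, y, row_height = 0.0, y + row_height, 0.0
        if y + d > W:                 # square is full
            break
        x += d
        row_height = max(row_height, d)
        count += 1
    return count / W**2

rng = np.random.default_rng(0)
W, sigma = 100.0, 0.5                 # illustrative values only
for mean in (2.0, 4.0, 8.0):
    avg = np.mean([estimate_density(mean, sigma, W, rng) for _ in range(20)])
    print(f"mean diameter {mean}: ~{avg:.4f} circles per unit area")
```

If the estimated density decreases monotonically as the mean grows, that at least supports the conjecture numerically before attempting a proof.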
 
This sounds like a challenging problem. You say that there is a minimum mean you can choose. That's a good place to start. From that point, try to find the formula for areal density. Are you allowed to pack the circles in any way you like to maximize the total number in the square?
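
For a first pass at that formula, here is a back-of-envelope sketch (a rough assumption added here: approximate the packing by treating each circle as occupying a ##\mu \times \mu## bounding square at the mean diameter ##\mu##, ignoring the spread ##\sigma## and edge effects):

$$N \approx \left\lfloor \frac{W}{\mu} \right\rfloor^{2}, \qquad \rho = \frac{N}{W^{2}} \approx \frac{1}{\mu^{2}},$$

which is strictly decreasing in ##\mu##, consistent with the guess that the minimum allowed mean maximizes the areal density.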
 