When you add impurity atoms to a material, the yield strength often increases by a process known as solid solution hardening. This happens because the impurity atoms act as obstacles to dislocation motion. The literature on this phenomenon dates back to the 1960s, with some famous papers by R.L. Fleischer.

Anyway, this hardening is supposedly controlled by the spacing between the impurity atoms. Since that spacing should be related to the impurity concentration in some way (i.e., more impurities means closer spacing between them), the increase in yield strength is often plotted against impurity concentration. Experimentally, this increase has been shown to vary with the square root of the impurity concentration.

**My question is this: why is it a square-root dependence, and not a cube-root dependence?** Atomic concentration is per unit volume, so if you increase the impurity concentration, the spacing between the impurity atoms should scale as the inverse cube root of the concentration, right? Every resource I have found states that the impurity spacing varies inversely with the square root of the impurity concentration, but I have yet to find a good explanation for this. Am I missing something? Can someone explain?
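To make the scaling argument in my question concrete, here is a minimal sketch of what I mean by the 3D geometric expectation (the function name and numbers are just illustrative, not from any reference):

```python
# If impurities are distributed uniformly in 3D at a number density c
# (atoms per unit volume), each impurity "occupies" a volume ~ 1/c,
# so the typical spacing between impurities is ~ c**(-1/3).
def mean_spacing_3d(c):
    """Typical impurity spacing for a volume concentration c (atoms per unit volume)."""
    return c ** (-1.0 / 3.0)

# Sanity check: 8x the concentration should halve the spacing.
print(mean_spacing_3d(8.0))  # 0.5 (in the same length units)

# Doubling the concentration shrinks the spacing by 2**(-1/3) ~ 0.794,
# not by 2**(-1/2) ~ 0.707 as an inverse-square-root law would imply.
ratio = mean_spacing_3d(2.0) / mean_spacing_3d(1.0)
print(ratio)
```

This is exactly the naive volumetric reasoning I described above, so I don't see where the square root comes from.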