I'm looking at concentrating light (e.g. sunlight) with a lens, and trying to figure out the maximum concentration I can get. A perfectly collimated source can be focused down to a point, but when the source has some divergence, the spot you can focus down to has finite area. If a lens has a focal length f, then the radius of the spot you get is f*tanθ, where θ is the angle of divergence of the source (~0.3° for the sun, for example). The second figure on this page should make the reason for this clear. The concentration is then the ratio of the lens area to the spot area, which is πR^2/π(f*tanθ)^2 if R is the radius of the lens.

What I'm confused about is that this seems to contradict the result you get from an etendue-type argument. If a lens with f = R is used (about the shortest focal length you can get for a glass/plastic lens), then the concentration ratio above simplifies to 1/(tanθ)^2. For small θ (which will generally be the case in concentration applications), this is very close to 1/(sinθ)^2, which is the limit from the etendue perspective when your divergence angle after the spot is 90°. However, if you look at the geometric setup of a concentrating lens with f = R, the divergence angle after the spot will be ~45°. So in my original way of approaching the problem, etendue is not conserved. Where did I go astray in my approach, or what am I misunderstanding? Many thanks in advance to anyone who can help me understand this!
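For anyone who wants to put numbers on this, here is a quick Python sketch of the area-ratio estimate above. The lens radius R = 0.1 m is an arbitrary assumed value; θ = 0.3° is the solar half-angle quoted in the post.

```python
import math

# Assumed values: theta is the solar divergence half-angle from the post,
# R = f = 0.1 m is an arbitrary lens size with the shortest practical focal length.
theta = math.radians(0.3)
f = R = 0.1

spot_radius = f * math.tan(theta)                # radius of the focused spot, f*tan(theta)
concentration = (math.pi * R**2) / (math.pi * spot_radius**2)  # lens area / spot area

# With f = R the ratio reduces to 1/tan(theta)^2, independent of lens size.
print(f"spot radius = {spot_radius * 1e3:.2f} mm, concentration = {concentration:.0f}x")
```

Note that with f = R the lens size cancels out entirely, which is why the post can quote the concentration as a function of θ alone.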
With R=f you have 45° deviation from the central axis in "both" (all) directions, for an opening angle of 90°.
I should clarify that all the angles I mentioned in the original post were half-angles (which is the relevant angle for calculating etendue), and that the maximum concentration condition (1/(sinθ)^2) occurs when the outgoing divergence half-angle is 90°. So even if the total angle is 2*45° = 90°, that doesn't satisfy the maximum concentration condition from the etendue standpoint.
A half-angle of 90° means full coverage of one hemisphere. But then you increase the solid angle from pi θ^2 to 2 pi, so you get a maximal concentration of 2/θ^2 (for small θ).
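A small numeric check of the two limits being compared here, using the ~0.3° solar half-angle from the original post. For small θ the hemispherical solid-angle limit 2π/(πθ^2) = 2/θ^2 is exactly half the surface etendue limit 1/sin(θ)^2 ≈ 1/θ^2:

```python
import math

theta = math.radians(0.3)           # solar half-angle (assumed value from the post)

c_hemisphere = 2 / theta**2         # point receiver: solid-angle ratio 2*pi / (pi*theta^2)
c_etendue = 1 / math.sin(theta)**2  # flat receiver: etendue limit at 90 deg exit half-angle

print(c_hemisphere, c_etendue, c_etendue / c_hemisphere)
```

The printed ratio comes out very close to 0.5, which is the factor-of-two discrepancy the thread is circling around.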
I think I see the difference. The wikipedia page calculates the maximal concentration on a surface (where the angle of incidence plays a role), and the other approach considers the maximal concentration at a "point" (where it does not). Concerning the point: the best possible concentration for any point is "the sun is visible in all directions" (apart from "downwards"). This corresponds to a solid angle of 2 pi. The sun subtends a solid angle of approximately pi θ^2, so the best concentration is just the ratio of the two.
Thanks for your help so far! It makes sense to me that concentration is the ratio of the solid angle the spot "sees" to the solid angle the lens "sees," and given that, the maximum concentration ratio should be 2/θ^2. (I am still confused as to why this seems to differ from the etendue argument, as well as from the value given for maximum solar concentration in the literature.)

Going back to my lens setup: if I call the half-angle at the spot β, then the solid angle the spot sees is 2π(1-cosβ). This gives a concentration of 2(1-cosβ)/θ^2. Compare this to the ratio of areas, R^2/(f tanθ)^2, which for f = R is 1/(tanθ)^2 ≈ 1/θ^2 as stated earlier. Setting these equal gives β = 60°, which is quite far from the expected value of 45° given the geometry of the setup. I am thus tempted to think there is something wrong with my assumption that the spot size at the focal point of the lens should be f*tanθ, but as far as my understanding of optics goes, that is correct.
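The β = 60° step above can be verified directly: setting the solid-angle concentration 2(1-cosβ)/θ^2 equal to the area-ratio concentration 1/θ^2 makes θ cancel, leaving cosβ = 1/2.

```python
import math

# Solve 2*(1 - cos(beta))/theta^2 = 1/theta^2 for beta.
# The theta^2 cancels, leaving 2*(1 - cos(beta)) = 1, i.e. cos(beta) = 1/2.
beta = math.degrees(math.acos(0.5))
print(beta)  # 60 degrees, not the ~45 deg suggested by the f = R lens geometry
```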
Actually, I'm not convinced that the ratio of solid angles should give you the concentration ratio. This would be true if the intensity were uniform both before and after the lens, but it isn't immediately clear to me whether that is the case. I need to think about it a bit more.