lubuntu said: My point is our goal shouldn't be to use less energy, because that isn't really a solution to the problem. The solution is to harvest that silly thing 93 million miles away that is spewing out a kW per sq ft!

More like 30 watts per sq ft. And that's the total, not what we can actually use with current technology. Solar power research is great, but if we want to use it on a very large scale, we'd better put the collectors in space, where they wouldn't significantly block sunlight from reaching earth's surface.

Quite right. Perpendicular to the radius vector from the sun, the irradiance ranges from about 1321 to 1413 W/m^2 over the year. The image of the earth (projected onto the plane normal to the sun's rays) has area [itex]\pi R^2[/itex] (the disk facing the sun), while its surface area (a sphere) is [itex]4 \pi R^2[/itex], a factor of 4 difference; this averages over both latitude variation and the diurnal cycle. That gives ~340 W/m^2 (~32 W/ft^2), before a further reduction (I don't know the value) for absorption by the atmosphere, and then large losses from conversion inefficiency (either photovoltaic or thermodynamic, i.e. Carnot losses).
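The factor-of-4 averaging above is easy to check numerically. A minimal sketch (the 1366 W/m^2 mid value and the exact 1 ft = 0.3048 m conversion are my inputs, not from the thread):

```python
# Average solar irradiance over the whole earth: intercepted power S * pi*R^2
# spread over the sphere's surface area 4*pi*R^2 gives S/4.
S = 1366.0  # W/m^2, rough mid value of the 1321-1413 W/m^2 annual range

avg_w_per_m2 = S / 4.0  # averages over latitude and the day/night cycle

# Convert to W per square foot (1 ft = 0.3048 m exactly).
avg_w_per_ft2 = avg_w_per_m2 * 0.3048**2

print(f"{avg_w_per_m2:.0f} W/m^2")    # ~342 W/m^2
print(f"{avg_w_per_ft2:.1f} W/ft^2")  # ~31.7 W/ft^2
```

This matches the ~340 W/m^2 (~32 W/ft^2) figure quoted above, before atmospheric absorption and conversion losses.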
There are some subtle points involved. For instance, a solar panel/receiver can be tilted relative to the earth's surface so that it lies in the plane normal to the sun's rays. So the latitude variation matters for land use, but not for the collector area needed (which is the cost-determining factor). Different adjustments are needed for each.
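The collector-area vs land-use distinction can be sketched as follows. This is a rough toy model under stated assumptions: the irradiance and efficiency values are hypothetical placeholders, and the 1/cos(latitude) land factor is only an idealized estimate of how tilted panels stretch their footprint (it ignores row spacing, seasons, and weather):

```python
import math

def areas_for_power(power_w, latitude_deg, irradiance=1000.0, efficiency=0.2):
    """Toy model: collector area vs land footprint for sun-facing panels.

    irradiance: assumed direct-beam W/m^2 on a panel held normal to the sun.
    efficiency: hypothetical conversion efficiency.
    """
    # Collector area is set by the beam irradiance, independent of latitude.
    collector_m2 = power_w / (irradiance * efficiency)
    # A panel tilted up toward the sun casts a longer footprint on the
    # ground; land use grows roughly as 1/cos(latitude) in this idealization.
    land_m2 = collector_m2 / math.cos(math.radians(latitude_deg))
    return collector_m2, land_m2

# A 1 MW plant at the equator vs at 60 degrees latitude:
print(areas_for_power(1e6, 0))   # (5000.0, 5000.0...) -- same collector area
print(areas_for_power(1e6, 60))  # (5000.0, ~10000)    -- about twice the land
```

The point of the sketch: moving to high latitude roughly doubles the land footprint here, while the collector area (the cost driver) stays fixed.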
Al68 said: The practical solution is nuclear power. Current technology is vastly cleaner and safer than the existing power plants that were designed in our (nuclear) infancy. Even the existing plants are far and away cleaner and safer than other sources. And we won't have to worry about running out of fuel for a VERY, VERY long time.

Totally agree.