1. Aug 1, 2007

### patapat

1. The problem statement, all variables and given/known data
Determine the fraction of the energy radiated by the sun in the visible region of the spectrum (350 nm to 700 nm). (Assume the sun's surface temperature is 5800 K.)

2. Relevant equations
$$R = \sigma T^{4}$$

where $$\sigma$$ is the Stefan-Boltzmann constant, $$\sigma = 5.67\times 10^{-8}\ \mathrm{W\,m^{-2}\,K^{-4}}$$.

3. The attempt at a solution
I'm not sure whether the blackbody radiation equation is relevant, and I don't know where to begin with this.

2. Aug 1, 2007

### nrqed

The formula you gave (the Stefan-Boltzmann law) gives the total intensity radiated over all wavelengths, so by itself it can't answer your question. You need the spectral function $I(\lambda,T)$ (Planck's law), which you will have to integrate over the wavelength range provided (and you could check that integrating from 0 to infinity reproduces the SB law).

Hope this helps
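
For reference, one common convention for Planck's law is (the overall prefactor cancels in the ratio below, so the exact convention doesn't matter for this problem):

$$I(\lambda,T) = \frac{2\pi h c^{2}}{\lambda^{5}}\,\frac{1}{e^{hc/\lambda k T}-1}$$

With the substitution $$x = \frac{hc}{\lambda k T}$$, the fraction of the power in the visible band becomes a ratio of dimensionless integrals:

$$f = \frac{\int_{350\,\mathrm{nm}}^{700\,\mathrm{nm}} I(\lambda,T)\,d\lambda}{\int_{0}^{\infty} I(\lambda,T)\,d\lambda} = \frac{\int_{x(700\,\mathrm{nm})}^{x(350\,\mathrm{nm})} \frac{x^{3}}{e^{x}-1}\,dx}{\int_{0}^{\infty} \frac{x^{3}}{e^{x}-1}\,dx}$$

Note that the limits flip, because x decreases as $\lambda$ increases.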

3. Aug 2, 2007

### malawi_glenn

Then you perform the integration between 350 nm and 700 nm, and compare the result with the integration from 0 to infinity.

Hint: that last result is also known as a "theorem". What is it called, and what does it look like?
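
As a quick numerical sanity check of that hint (a sketch in plain Python with a hand-rolled composite Simpson rule, which is my addition, not anything from the thread; the upper limit is truncated at x = 50, where the integrand is ~1e-17 and the tail is negligible):

```python
import math

def simpson(f, a, b, n=10000):
    # Composite Simpson's rule; n must be even
    step = (b - a) / n
    s = f(a) + f(b)
    for i in range(1, n):
        s += f(a + i * step) * (4 if i % 2 else 2)
    return s * step / 3

def integrand(x):
    # x^3 / (e^x - 1); the limit as x -> 0 is 0, so guard the endpoint
    return x**3 / math.expm1(x) if x > 0 else 0.0

# Truncating at x = 50 loses only ~1e-17 of the integral
total = simpson(integrand, 0.0, 50.0)
print(total)            # numerically matches pi^4 / 15
print(math.pi**4 / 15)
```

The two printed values agree, which is the closed form the hint is pointing at.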

4. Feb 17, 2009

### Confusedent

I'm working this same problem right now and have done everything suggested (and then some), but even using Maple to integrate, I keep getting that the definite integral comes out to approximately 0 - 0 = 0. This is obviously wrong, since more than zero energy gets radiated in the visible portion.

Converting the energy density integral to

$$8\pi h c\left(\frac{kT}{hc}\right)^{4}\int \frac{x^{3}}{e^{x}-1}\,dx$$

using the substitution $$x = \frac{hc}{\lambda k T}$$, Maple gives me the antiderivative of $$\int \frac{x^{3}}{e^{x}-1}\,dx$$ as

$$-\tfrac{1}{4}x^{4} + x^{3}\ln(e^{x}-1) + 3x^{2}\,\mathrm{polylog}(2,e^{x}) - 6x\,\mathrm{polylog}(3,e^{x}) + 6\,\mathrm{polylog}(4,e^{x})$$

Converting 350 nm and 700 nm to values of x and evaluating gives the 0 answer.

Can anyone point out what silly mistake I must be making? Also, sorry about typing the formulas out like that; I haven't figured out how to format them here yet.
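
A numerical sketch (plain Python with a composite Simpson rule and rounded constants, all my own additions) that sidesteps the polylog antiderivative entirely, plus one guess at the culprit: if the substitution was coded left to right as `hc/lambda*kT`, Python and Maple both parse that as `(hc/lambda)*(kT)`, which is around 1e-38 for both wavelengths, so both limits collapse to x ~ 0 and the antiderivative evaluates to 0 - 0 = 0.

```python
import math

# Physical constants (SI units, rounded)
h = 6.626e-34   # Planck constant, J s
c = 2.998e8     # speed of light, m/s
k = 1.381e-23   # Boltzmann constant, J/K
T = 5800.0      # assumed solar surface temperature, K

def x_of(lam):
    # Dimensionless Planck variable x = hc / (lambda * k * T)
    return h * c / (lam * k * T)

def simpson(f, a, b, n=2000):
    # Composite Simpson's rule; n must be even
    step = (b - a) / n
    s = f(a) + f(b)
    for i in range(1, n):
        s += f(a + i * step) * (4 if i % 2 else 2)
    return s * step / 3

integrand = lambda x: x**3 / math.expm1(x)

x_hi = x_of(350e-9)   # short-wavelength limit, about 7.09
x_lo = x_of(700e-9)   # long-wavelength limit, about 3.54

# Fraction of total radiated power in the 350-700 nm band
frac = simpson(integrand, x_lo, x_hi) / (math.pi**4 / 15)
print(frac)           # roughly 0.42

# Suspected bug: "hc/lambda*kT" parsed left to right is
# (hc/lambda)*(kT) ~ 4e-38, driving both limits to x ~ 0
wrong_x = h * c / 350e-9 * k * T
print(wrong_x)
```

With the correct limits the fraction comes out a bit over 40%, which is a sensible answer for a 5800 K blackbody over 350-700 nm.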