mezarashi
#11
Jun17-05, 12:39 PM
HW Helper
P: 660
Quote by rbj:
How is it, then, that they tell us there is about 1360 watts/m^2 of solar radiation at our distance of 93,000,000 miles from the sun (and from that they calculate that the output of the sun is about 3.8 x 10^26 watts)? How do they measure that? I think there is some way to calibrate photosensors, but I don't know what it is.

r b-j

I did not mean that photodiodes cannot be calibrated. I was assuming you had a home-made photosensing device and wanted to calibrate it. Manufactured photodiodes are typically pre-calibrated. Calibration means fixing the sensor's response at two points of known value:

Lights off (zero intensity): the photodiode produces current a, which you calibrate as 0.
Lights on (100 W light bulb): the photodiode produces current b, which you calibrate as x, where x is the known reference intensity (you can use physics to figure out what x is).
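To make the two-point idea concrete, here is a minimal sketch in Python. It assumes the photodiode's current is linear in intensity (a good approximation well below saturation); the variable names and the example readings are illustrative, not measured values.

```python
# Two-point calibration as described above: map photodiode current
# to intensity by fixing the response at a dark point and a reference point.
# Assumes a linear current-vs-intensity response.

def make_calibration(i_dark, i_ref, intensity_ref):
    """Return a function mapping photodiode current (A) to intensity (W/m^2).

    i_dark        -- current a, measured with the lights off
    i_ref         -- current b, measured under the reference source
    intensity_ref -- the known reference intensity x
    """
    slope = intensity_ref / (i_ref - i_dark)

    def current_to_intensity(i):
        return (i - i_dark) * slope

    return current_to_intensity

# Example: 2 nA dark current, 150 nA under a source of known intensity 5 W/m^2.
cal = make_calibration(2e-9, 150e-9, 5.0)
print(cal(2e-9))    # 0.0 -- lights off maps to zero intensity
print(cal(150e-9))  # 5.0 -- the reference point maps to the known intensity
print(cal(76e-9))   # 2.5 -- readings in between are interpolated linearly
```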


Quote by HungryChemist:
I don't think I can just read off how much the current has changed from your photodiode. What you're trying to show (as a demonstrator of the photoelectric effect) is that as you increase the intensity of the light, the photoelectric current also increases by some factor. But don't you already have to know how much you have increased the intensity of the light in order to make a fair judgement?
For the most part, what you do is increase the power of the light source. Say you originally dissipated 30 W in the light source and now you dissipate 60 W. Electrical power is easy to measure, so now your intensity is known: intensity is a function of power and distance, and for a source radiating equally in all directions it is I = P/(4 pi r^2).
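As a quick sketch of that inverse-square relation (the function name is illustrative; the solar figures are the ones from rbj's quote, with the Earth-sun distance converted to metres):

```python
import math

def intensity(power_watts, distance_m):
    """Intensity (W/m^2) at distance r from an isotropic source of power P.

    Inverse-square law: the emitted power spreads over a sphere
    of area 4*pi*r^2.
    """
    return power_watts / (4 * math.pi * distance_m**2)

# Sanity check against rbj's numbers: the sun's output (~3.8e26 W)
# at the Earth-sun distance (~1.496e11 m) should give roughly
# the quoted ~1360 W/m^2.
print(intensity(3.8e26, 1.496e11))  # ~1350 W/m^2

# The demonstration case: doubling the source power at a fixed distance
# doubles the intensity, which is the ratio the photocurrent is checked against.
print(intensity(60, 0.5) / intensity(30, 0.5))  # 2.0
```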