I was bored, my internet was out (from the rain), and so was TV. So all I could do was sit there and think... I started asking myself: how much energy does a single raindrop have, and how much electricity could THEORETICALLY (at 100% efficiency) be produced from falling rain? So when my internet came back up, I thought I would try and calculate it.

From searching the internet, these are the variables I was able to come up with: a typical raindrop is 2mm in diameter, has a terminal velocity of 6.25856 m/s, and has a mass of 4mg.

Using the kinetic energy formula, I get (0.5) * (4x10^-6 kg) * (6.25856 m/s)^2 = 7.833914655x10^-5 joules of energy per raindrop.

Looking up historical weather data for my city, I find that rainfall averages about 4.25 inches per month, or 10.79500 cm. The density of water is 1 g/cm^3, so 1000 mg per cm^3 divided by 4 mg per drop gives 250 raindrops per cubic cm. 250 * 10.79500 = 2698.75 raindrops falling on each cm^2 of area per month. 2698.75 * 7.833914655x10^-5 = 0.2114177718 joules per cm^2 per month; multiply by 12 and we get 2.537013261 joules per cm^2 per year.

1 joule = 1 watt-second. From a quick google search, I came up with average household power usage of 8900 kilowatt-hours per year, or 32040000000 watt-seconds. So to power a house for a year from rain, you would need an energy converter covering an area of 1.262902346x10^10 cm^2, which is about 1.26 square kilometers, or roughly 0.49 square miles.

So if my math is correct, powering our homes from rain probably isn't the best of ideas. If anyone sees any errors in my math/calculations (and there most likely are; it's been a couple of years since I've taken any physics classes), please let me know.
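For anyone who wants to check the arithmetic, here is a quick Python sketch of the whole chain, using the same assumed inputs as above (4 mg drop, 6.25856 m/s terminal velocity, 4.25 in/month rainfall, 8900 kWh/year household usage):

```python
# Back-of-envelope: energy in falling rain at 100% conversion efficiency.
# All input numbers are the rough values quoted in the post above.

drop_mass_kg = 4e-6            # typical raindrop: 4 mg
terminal_velocity = 6.25856    # m/s

# Kinetic energy per drop: KE = 1/2 * m * v^2
ke_per_drop = 0.5 * drop_mass_kg * terminal_velocity**2   # joules

rainfall_cm_per_month = 4.25 * 2.54       # 4.25 inches -> 10.795 cm
drops_per_cm3 = 1000 / 4                  # 1000 mg of water per cm^3, 4 mg per drop
drops_per_cm2_month = drops_per_cm3 * rainfall_cm_per_month

joules_per_cm2_year = drops_per_cm2_month * ke_per_drop * 12

household_joules_per_year = 8900 * 3.6e6  # 1 kWh = 3.6e6 joules (watt-seconds)

area_cm2 = household_joules_per_year / joules_per_cm2_year
area_mi2 = area_cm2 / 160934.4**2         # 1 mile = 160934.4 cm, so divide by cm-per-mile squared

print(f"KE per drop:          {ke_per_drop:.9e} J")
print(f"Energy per cm^2/year: {joules_per_cm2_year:.9f} J")
print(f"Collector area:       {area_cm2:.6e} cm^2 = {area_mi2:.3f} mi^2")
```

The one place this differs from my first pass: converting cm^2 to mi^2 means dividing by the number of centimeters in a mile *squared* (about 2.59x10^10), not by the linear factor, which is where the 78,473 mi^2 figure originally came from.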