Hey guys, I am a complete beginner in the field of electrical engineering, so please bear with me in this discussion.
It is my understanding that:
1) The Earth absorbs 3,850,000 exajoules of solar energy every year, which means that if we could harness even 50% of the sun's energy for just one day, it would be enough to power Earth for many years.
2) Current photovoltaic panel technology captures and converts about 15% of the sunlight it receives.
3) The majority of this energy loss is due to band gap loss. The optimal band gap when balancing spectrum and voltage is about 1.4 eV.
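Here's a quick back-of-envelope check of the numbers in point 1 (I'm assuming a global primary energy consumption of roughly 600 EJ/year; that figure is my own ballpark and not something from this thread):

```python
# Back-of-envelope check of point 1. The world consumption figure below
# is an assumed ballpark (~600 EJ/year), not an authoritative value.
SOLAR_EJ_PER_YEAR = 3_850_000      # solar energy absorbed by Earth per year (from point 1)
WORLD_EJ_PER_YEAR = 600            # assumed global primary energy use per year

one_day = SOLAR_EJ_PER_YEAR / 365  # EJ absorbed by Earth in a single day
captured = 0.5 * one_day           # harness 50% of one day's absorption
years_supplied = captured / WORLD_EJ_PER_YEAR

print(f"~{captured:.0f} EJ captured, enough for ~{years_supplied:.1f} years of supply")
```

So "many years" works out to just under a decade under that assumption, which is still a striking amount for half a day of sunlight.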
So here's my question:
Why does a smaller band gap result in a lower voltage? Is there not some way to regulate or hold the voltage constant while still capturing and converting a wider band of light?
Thanks for any help!
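For anyone who wants to play with the numbers in point 3, here's a small sketch using the standard photon-energy relation E = hc/λ. Photons with energy below the band gap pass through unabsorbed, and energy above the gap is lost as heat, which is the trade-off I'm asking about (the specific band-gap values below are just illustrative examples I picked):

```python
# Illustration of the band-gap trade-off from point 3, using the standard
# photon-energy relation E = hc/lambda. The example gap values are
# illustrative choices, not values from this thread.
H_C_EV_NM = 1239.84  # Planck constant times speed of light, in eV*nm

def cutoff_wavelength_nm(band_gap_ev):
    """Longest photon wavelength a cell with this band gap can absorb."""
    return H_C_EV_NM / band_gap_ev

# A smaller gap absorbs a wider slice of the spectrum (longer cutoff),
# but each absorbed photon can only deliver ~the gap energy as voltage.
for gap in (1.1, 1.4, 1.8):
    print(f"{gap} eV gap -> absorbs wavelengths up to ~{cutoff_wavelength_nm(gap):.0f} nm")
```

Running this shows why 1.4 eV is a balancing point: a 1.1 eV gap catches more of the infrared but wastes more energy per photon, while a 1.8 eV gap gives a higher voltage per photon but ignores everything beyond roughly the visible range.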