jimmylegss
I was having a discussion with a friend, and my side of the argument was that for solar to work you also need a lot of fossil fuel running in the background even during the day, making it very expensive. Let's take New York as an example.
Let's assume they have enough solar panels to produce 150% of the power needed on an average sunny day. But when it gets cloudy, you could have a lot of power outages if the sun suddenly goes away, meaning you need fossil plants idling in the background, wasting a lot of energy, because it takes time to fire up a power plant.
Now the question is: could you set up a network between cities within a range of 2,000-3,000 miles, where if it becomes cloudy and rainy in one city (so solar can only sustain about 20-30% of its power needs), other cities within hundreds or possibly several thousand miles can jump in? Assuming, of course, that every city in this network has power left over on sunny days.
How much would be lost per mile? Or would we really need batteries to make this sustainable?
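To get a feel for the per-mile question, here is a quick back-of-the-envelope sketch. All the numbers (500 kV line, 1 GW of power, 0.01 ohm per mile) are illustrative assumptions, not real line data, but they're in the right ballpark for modern long-distance HVDC links:

```python
# Rough sketch of resistive transmission loss over a long line.
# All figures below are illustrative assumptions, not real line specs.
voltage = 500e3      # line voltage in volts (assumed 500 kV HVDC)
power = 1e9          # power fed into the line in watts (assumed 1 GW)
r_per_mile = 0.01    # line resistance in ohms per mile (assumed)
miles = 2000

current = power / voltage            # I = P / V
resistance = r_per_mile * miles      # total line resistance
loss = current ** 2 * resistance     # resistive loss: P_loss = I^2 * R

print(f"Current: {current:.0f} A")
print(f"Loss over {miles} miles: {loss/1e6:.0f} MW "
      f"({100 * loss / power:.1f}% of input)")
```

With these assumed numbers you lose on the order of a few percent per thousand miles, which is roughly consistent with what's reported for real HVDC projects, so long-distance sharing isn't obviously ruled out by line losses alone.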
Could it be possible to set up massive solar farms in the California desert and power all of the US?
Edit: my understanding is that A = V / Ohms (i.e., I = V/R).
So if you generate a certain amount of amps (that is electrons per second, right?) and you want to push a lot of them over large distances, you need to send a lot of them at once, meaning the voltage is high? And if the differences were small, and only small amounts of power were sent at a time, would more energy be lost to resistance? Or am I misreading that formula?