caliente
I recently came upon a reference to http://www.focusfusion.org, which led me to discover http://www.prometheus2.net and http://www.electronpowersystems.com. What these companies have in common is that they propose to generate power using "clean" proton-boron fusion in a pulsed reactor. By pulsed, I mean they would heat the fuel essentially from zero to a sufficiently high temperature for fusion, extract power, and repeat the cycle one to 1000 times per second. Another thing they all have in common is that they're soliciting funds from the public to develop their technology.
In trying to decide whether these guys were quacks, I started to think about whether it would be possible in a pulsed proton-boron reactor to get more energy out than you put into heating the reactants. I notice these companies have been discussed here before, and since I'm an EE by training, not a physicist, I thought I would post my thoughts to get some feedback.
It seems a critical parameter is the fraction of fuel ions you can get to react in each cycle. If none react, all the energy you put into heat the fuel is wasted. If all of them react, you have saved the world. Somewhere in between is a break even point.
Let's suppose that the fraction of fuel ions that react is f. For proton-boron fusion, you have to heat the fuel to approximately 1E9 Kelvin, or 120 keV, in order for fusion to occur (all parameters taken from http://en.wikipedia.org/wiki/Nuclear_fusion). It's my understanding that a temperature of 120 keV means the average energy of the ions is 120 keV, and I'm assuming that means you have to add 120 keV of energy per ion in order to achieve that temperature.
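As a quick sanity check of that "1E9 Kelvin, or 120 keV" figure, here is the conversion T = E / k_B in Python, using standard values for Boltzmann's constant and the electron-volt:

```python
# Convert the quoted 120 keV ignition temperature to Kelvin via T = E / k_B.
k_B = 1.380649e-23      # Boltzmann constant, J/K
eV = 1.602176634e-19    # joules per electron-volt

T_keV = 120.0                   # p-B ignition temperature, keV
T_K = T_keV * 1e3 * eV / k_B    # same temperature in Kelvin
print(f"{T_keV:.0f} keV = {T_K:.2e} K")   # ~1.4E9 K, i.e. order 1E9 as stated
```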
Let's generously suppose you can recover 50% of the heat energy you put in. In addition, for each ion that reacts, you get 8.7/2 MeV out (8.7 MeV per 2 ions, one hydrogen ion and one boron ion).
Therefore, the total energy input is
Ein = 120 keV per ion
and the total energy out is
Eout = Ein/2 + f * 8.7/2 MeV per ion
At breakeven Ein = Eout, or
f * 4.35 MeV = 60 keV
f = 1.4%
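That break-even fraction follows directly from the energy balance above, and a couple of lines of Python confirm it:

```python
# Break-even burn fraction from the balance Ein = Ein/2 + f * Efus/2,
# which rearranges to f = Ein / Efus.
E_in = 120e3    # heating energy per ion, eV
E_fus = 8.7e6   # p-B fusion yield per reaction (two ions), eV

f = E_in / E_fus
print(f"break-even burn fraction f = {f:.2%}")   # ~1.4%
```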
As I understand it, f is related to temperature, density, time and the reaction "cross section". Again from Wikipedia, the reaction rate is r = n1 * n2 * <ov>, where n1 and n2 are the reactant densities and <ov> is the cross section times relative velocity, averaged over the velocity distribution. At 120 keV, <ov> for proton-boron fusion is 3E-27 * 120^2 = 43E-24 m^3/s.
The fraction of fuel burned f = t * r / (n1 * n2) or simply t * <ov>. In order to burn 1.4% of the ions, you would have to sustain the reaction for
t = 1.4% / 43E-24 = 3.3E20 s ~= 10 trillion years.
Basically, what this computation says is that in order to get any net energy out of a proton-boron reactor, you have to heat the reactants to 120 keV and hold them at that temperature for 10 trillion years.
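Here is the same arithmetic in Python. Note that it still uses my simplifying step f = t * <ov> from above (the densities cancel in my formula), which may well be where the error is:

```python
# Burn time implied by my (possibly flawed) assumption f = t * <ov> for p-B.
sigma_v = 3e-27 * 120**2    # <ov> at 120 keV, m^3/s (~4.3E-23)
f = 0.014                   # break-even burn fraction from above
seconds_per_year = 3.156e7

t = f / sigma_v
print(f"t = {t:.1e} s = {t / seconds_per_year:.1e} years")   # ~1E13 years
```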
In comparison, the computations for deuterium-tritium are:
Ein = 13.6 keV per ion
Eout = Ein/2 + f * (3.5 + 14.1)/2 MeV per ion
f = 0.077 %
<ov> = 1.24E-24 * 13.6^2 = 23E-23 m^3/s
t = 0.077% / 23E-23 = 3.3E18 s ~= 100 billion years.
In other words, in order to get any net energy out of a D-T reactor, you have to heat the reactants to 13.6 keV and hold them at that temperature for 100 billion years.
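The D-T comparison can be run the same way, again under my f = t * <ov> assumption:

```python
# Same break-even and burn-time calculation for D-T, for comparison.
E_in = 13.6e3                  # heating energy per ion, eV
E_fus = (3.5 + 14.1) * 1e6     # D-T yield per reaction (two ions), eV
sigma_v = 1.24e-24 * 13.6**2   # <ov> at 13.6 keV, m^3/s (~2.3E-22)
seconds_per_year = 3.156e7

f = E_in / E_fus               # break-even burn fraction, ~0.077%
t = f / sigma_v
print(f"f = {f:.3%}, t = {t:.1e} s = {t / seconds_per_year:.1e} years")
```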
Based on that result, I'm guessing this computation is not correct. Can anyone show what the correct result would be?
Thanks!