Why ITER is a Useless Investment: The Truth about Tritium and Tokamaks

  • Thread starter Enthalpy
In summary, the conversation discusses the cost and usefulness of the tokamak fusion reactor, with the conclusion that it is currently not a practical or feasible solution for providing clean energy. The main point is that the production of tritium, a necessary component for the tokamak, is not readily available and would require large amounts of energy from fission reactors to produce. The conversation also mentions potential safety and environmental concerns with fusion reactors.
  • #1
Enthalpy
As the cost of ITER is now estimated at 16 G€, I want to point out that such a tokamak is completely useless, because it requires tritium that is not available.

That is:
- Tritium does not exist naturally, because its half-life is only about 12 years and it isn't produced on Earth by any process; the claim that it can be harvested "in the ocean" is simply false, whether by ignorance or by deception.
- Tritium is produced by uranium reactors, in tiny amounts. A fission reaction produces 200 MeV of heat to create less than one available neutron, which is needed to produce one tritium atom, for instance from lithium. One tritium atom consumed in a tokamak then produces less than 20 MeV of heat. In other words, one 1 GW tokamak needs >10 GW of fission reactors operating.
- Just like any magnetic-confinement reactor, tokamaks don't produce their own tritium. One reason: the D-T reaction produces only one neutron, and one neutron breeds less than one tritium atom, for instance from a lithium blanket. Some pretend that "neutron multipliers" like lead achieve a tritium regeneration factor of 1.1, but this is a theoretical best case that assumes no other design constraints on the tokamak. And there are design constraints, which in fact prevent doing anything more than keeping the plasma hot and confined.
- Tokamaks can't burn anything other than tritium in any foreseeable future. Reactions other than D-T, like D-D or D-Li, require plasma conditions that are far more difficult to achieve. Nobody can predict how many more half-centuries it will take before those reactions are usable.
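The fission-to-fusion heat ratio asserted above can be checked with quick arithmetic. A minimal sketch, where the 1.5 spare neutrons per fission and the 50% breeding-efficiency figure are illustrative assumptions taken from the thread, not measured values (and the actual D-T yield of 17.6 MeV is used instead of the rounded 20 MeV):

```python
# Rough energy bookkeeping for fission-bred tritium (thread's assumptions).
E_FISSION_MEV = 200.0      # heat released per U-235 fission
NEUTRONS_AVAILABLE = 1.5   # neutrons left after sustaining the chain (upper bound)
E_FUSION_MEV = 17.6        # energy released by one D-T fusion event

# Best case: every spare neutron breeds exactly one tritium atom.
fission_heat_per_tritium = E_FISSION_MEV / NEUTRONS_AVAILABLE  # ~133 MeV

# Fission heat needed per unit of fusion heat, before any losses.
ratio = fission_heat_per_tritium / E_FUSION_MEV
print(f"Fission/fusion heat ratio (ideal): {ratio:.1f}")  # ~7.6

# With an assumed 50% breeding efficiency the ratio exceeds 10,
# which is the figure the post uses.
print(f"With 50% breeding efficiency: {2 * ratio:.1f}")
```

The ideal-case ratio comes out below 10; only after assuming substantial capture losses does the ">10 GW" figure follow.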

So:

- Tokamaks can't replace fission reactors, not even a small fraction of them.
- Tokamaks are useless. ITER is useless.
- We could save 16 G€ worth of physicists' time for useful and sensible projects, like geothermal energy, storage of wind electricity, or solar heat. We would have solved all of them with the money already wasted on tokamaks.
 
  • #2
This is a joke, right? Are you seriously comparing an experimental research reactor to a commercial power plant?

While it is true that commercial light water fission reactors produce a little bit of tritium, it is not harnessed in this way because it is far easier to breed tritium from either lithium or boron using a research reactor.

Furthermore, it's not as if you would be throwing away all those gigawatts of heat from the commercial-sized power reactor just to make tritium. Using your example (which is totally incorrect for several other reasons), you would be making >10 GW of energy from fission and getting 1 GW of fusion fuel as a bonus on top of that.

Additionally, if you had a working practical tokamak, it would indeed produce its own tritium using the same method as above - the extra neutrons from fusion are absorbed in lithium to create more tritium.

So essentially, you are advocating cancelling research into unlimited clean power in the near future in favor of a couple of gigawatts of unreliable, extremely expensive and inefficient "renewable" power now. Good call.
 
  • #3
Perfectly serious, and I stand by it.

A research reactor would produce >10 GW of heat just as a power-plant reactor would, when producing tritium for the 1 GW tokamak. So the tokamak is useless: it can't replace the fission reactors, not even a fraction of them.

Tokamaks can't produce their own tritium. As already explained in the third item of my first post: the "extra neutrons" are mathematically too few.

-----

You add "clean" fusion power as yet another of the usual promises made by proponents. This is false as well.

As fusion produces 1 neutron per 20 MeV of heat, instead of 3 neutrons per 200 MeV of heat for fission, the radioactivity induced in reactor materials would be 3 times higher than in fission reactors.

Or in fact much worse: with the much higher neutron doses and much higher neutron energies, the activation of surrounding materials becomes frankly impossible to control. In fission reactors, avoiding certain elements (like traces of cobalt in steel) limits the effects of neutron irradiation. But against spallation induced by energetic fusion neutrons, as well as successive neutron absorptions, such measures become ineffective.

-----

So research into tokamaks is not only very expensive and slow. At some point, we'll realize they are dirty and that we have no tritium to feed them.

Normal, sound management would require solving the tritium impossibility before wasting a single cent on this huge and meaningless enterprise, and abandoning it if it can't be solved.

-----

QuantumPion's last sentence is the usual argument aimed at a less informed public and shouldn't need an answer in a science forum, should it?
 
  • #4
Enthalpy said:
Perfectly serious, and I stand by it.

A research reactor would produce >10 GW of heat just as a power-plant reactor would, when producing tritium for the 1 GW tokamak. So the tokamak is useless: it can't replace the fission reactors, not even a fraction of them.

No it wouldn't. A research reactor is not a commercial power reactor. You do not need a 10 GW-thermal LWR to create a neutron source to breed tritium. You would never do this anyway: a water-cooled power reactor is designed specifically to make efficient use of neutrons and not let them be absorbed in the water, whereas a research reactor's purpose is to create lots of extra neutrons for whatever secondary purpose is required.

Even if you did want to use a commercial power reactor to also breed tritium, you would not be throwing away all of the thermal energy the reactor produces to create that tritium. The thermal power would still be fully utilized to generate electricity, regardless of whether you are also making tritium. You would end up with all of the electricity generated by the fission reactor, in addition to the fusion fuel you created for use in a fusion reactor. While in a fusion reactor much of the nuclear energy is carried away by the neutrons, in fission reactors only a few percent is, since power reactors are designed not to waste neutrons in the first place.

Anyway, ITER could not replace a commercial nuclear power plant any more than the Large Hadron Collider could, because it is a research reactor that is not designed or intended to produce net electric power.

Enthalpy said:
Tokamaks can't produce their own tritium. As already explained in the third item of my first post: the "extra neutrons" are mathematically too few.

I'm not a fusion engineer, so I don't know all of the engineering challenges related to tokamak tritium breeding. However, I do know that it is at least physically possible, and a quick Google search on the matter lists lots of different white papers on the issue. So if you would provide a link to a source proving that it is in fact impossible, that would be helpful.

Enthalpy said:
Perfectly serious, and I stand by it.
You add "clean" fusion power as yet another of the usual promises made by proponents. This is false as well.

As fusion produces 1 neutron per 20 MeV of heat, instead of 3 neutrons per 200 MeV of heat for fission, the radioactivity induced in reactor materials would be 3 times higher than in fission reactors.

Or in fact much worse: with the much higher neutron doses and much higher neutron energies, the activation of surrounding materials becomes frankly impossible to control. In fission reactors, avoiding certain elements (like traces of cobalt in steel) limits the effects of neutron irradiation. But against spallation induced by energetic fusion neutrons, as well as successive neutron absorptions, such measures become ineffective.

This is entirely inaccurate. You are ignoring the entire chain of fission products produced by that single fission event; the radioactivity is by no means limited to just the neutrons produced. One fission event can lead to dozens of radioactive atoms down the line, with half-lives in the tens of thousands of years.

Commercial fission reactors generate hundreds of metric tons of spent fuel, which is extremely radioactive and requires reprocessing or a geological repository for disposal. This radioactive waste is a mix of extremely long-lived actinides and fission products with half-lives in the ~1-10,000 year range.

By comparison, neutron activation of structural material produces radioisotopes of light elements, which have short half-lives. A fission reactor produces some of these from the components that make up the fuel rods, as well as from the reactor vessel itself. A fusion reactor would produce more; however, this is far less of a problem than the spent fission fuel itself, as decommissioning a fusion reactor would not require any kind of deep geological repository.

I don't know where you get the idea that neutron activation is "impossible to control". It is in fact trivially easy to control, and is controlled at every nuclear fission power plant in operation. Your last paragraph is entirely baseless, without merit or source.

Enthalpy said:
So research into tokamaks is not only very expensive and slow. At some point, we'll realize they are dirty and that we have no tritium to feed them.

Normal, sound management would require solving the tritium impossibility before wasting a single cent on this huge and meaningless enterprise, and abandoning it if it can't be solved.

-----

QuantumPion's last sentence is the usual argument aimed at a less informed public and shouldn't need an answer in a science forum, should it?

One of the goals of the ITER project is to test the capability of a tokamak to breed tritium. Since the primary source of fusion fuel will be the fusion plants themselves, how do you propose we "solve the problem" of not having enough tritium before doing the research aimed at solving that very problem? You are suggesting that we cancel research on the project because the project has not yet been completed successfully. That is a ridiculous circular fallacy.
 
  • #5
A simple Google search disagrees with your analysis:

"With three-dimensional modeling and neutron transport analysis, a tokamak with a low technology blanket containing beryllium was found to have a tritium breeding ratio of 1.54 tritons per DT neutron. Such a device would have a net tritium production capability of 9.1 kg/yr from 450 MW of fusion power at 70% capacity factor."
From the abstract of: Tritium breeding analysis of a tokamak magnetic fusion production reactor
Found at:
http://www.springerlink.com/content/n72nx03872g59356/
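As a sanity check, the abstract's net-production figure can be roughly reproduced from its own numbers. A sketch, assuming every D-T event yields 17.6 MeV and exactly one neutron:

```python
# Sanity check of the quoted abstract's net tritium figure.
MEV_TO_J = 1.602e-13
E_DT = 17.6 * MEV_TO_J        # J per D-T reaction
P_FUSION = 450e6              # W of fusion power (from the abstract)
CAPACITY = 0.70               # capacity factor (from the abstract)
TBR = 1.54                    # tritium breeding ratio (from the abstract)
M_T = 3.016                   # g/mol, tritium
N_A = 6.022e23                # Avogadro's number

# D-T reactions (and hence source neutrons) per year of operation.
reactions_per_year = P_FUSION / E_DT * 3.156e7 * CAPACITY

# Net tritium = bred minus burned, converted to kilograms.
net_tritons = (TBR - 1.0) * reactions_per_year
net_kg = net_tritons / N_A * M_T / 1000
print(f"Net tritium: {net_kg:.1f} kg/yr")  # ~9.5, close to the quoted 9.1
```

The small discrepancy with the quoted 9.1 kg/yr is within the rounding of these inputs.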

Heavy-water reactors, such as CANDU, produce more tritium per GW of heat than light-water reactors. They are also much more neutron-efficient and could be designed to produce more tritium.

Tritium can also be made from lithium in both fast and thermal reactors. Research reactors designed for irradiation can also be built with a higher flux-to-power ratio, for example the SLOWPOKEs.
 
  • #6
Do you expect something magical in the number of neutrons available from fission, just by choosing the type of reactor?

By its very nature, one 235U fission gives 200 MeV and about 2.5 neutrons (slightly more with fast neutrons), of which one is consumed by the next reaction of the sustained chain, leaving a maximum of 1.5 neutrons available for the production of radioisotopes, in this case tritium.

Research reactors try to make most of these 1.5 neutrons available, while power plants don't. But 1.5 is the maximum. Add the many large losses along the way, and you in fact get much more than 10 GW of fission heat per 1 GW of fusion power from the tritium consumed.

-----

Of course, these 10 GW would be used to produce electricity! But then, since these uranium reactors are still necessary, they might as well just produce 11 GW, instead of squandering 16 G€ to make the last 1 GW!
 
  • #7
I wrote precisely "radioactivity induced in reactor materials", and am happy to see you agree with me. You introduced the actinides and the fission products into the discussion, not I.

And I wrote it because, as usual, someone claimed fusion would be clean, which it isn't.

-----

This induced radioactivity is quite different from (and worse than) the kind known in fission reactors, in both dose and energy. Please read my post again.
 
  • #8
"The capability of Tokamak to breed tritium... has not been completed successfully yet"
-> More accurately, this insurmountable flaw was concealed until recently!
In fact, this demonstration was added to ITER only because some specialists had raised the objection.

ITER is to have a small demonstration module intended to show some tritium production, but it doesn't even intend to breed more tritium than it consumes. Which will prove nothing, because production as such is already known from other means; it's all a matter of quantity, of breeding more than is consumed.

Claimed breeding ratios suppose the whole cavity to be covered with some special, pure material to get a breeding factor slightly over 1. But tokamaks put other design constraints on their walls, you know? Resistance to temperature and neutron flux, vacuum cleanliness even when hot, magnetic and electric properties... These constraints alone are hardly met now, with designers hoping that carbon-carbon (graphite) composites may perhaps fit.

And as this objection looks insurmountable, it should be addressed first, before squandering huge sums.
 
  • #9
Enthalpy said:
That is:
- Tritium does not exist naturally, because its half-life is only about 12 years and it isn't produced on Earth by any process; the claim that it can be harvested "in the ocean" is simply false, whether by ignorance or by deception.
- Tritium is produced by uranium reactors, in tiny amounts. A fission reaction produces 200 MeV of heat to create less than one available neutron, which is needed to produce one tritium atom, for instance from lithium. One tritium atom consumed in a tokamak then produces less than 20 MeV of heat. In other words, one 1 GW tokamak needs >10 GW of fission reactors operating.
- Just like any magnetic-confinement reactor, tokamaks don't produce their own tritium. One reason: the D-T reaction produces only one neutron, and one neutron breeds less than one tritium atom, for instance from a lithium blanket. Some pretend that "neutron multipliers" like lead achieve a tritium regeneration factor of 1.1, but this is a theoretical best case that assumes no other design constraints on the tokamak. And there are design constraints, which in fact prevent doing anything more than keeping the plasma hot and confined.
- Tokamaks can't burn anything other than tritium in any foreseeable future. Reactions other than D-T, like D-D or D-Li, require plasma conditions that are far more difficult to achieve. Nobody can predict how many more half-centuries it will take before those reactions are usable.
This is essentially incorrect, even to the point of misinformation!

Various folks are looking to extract deuterium from seawater, not tritium. Deuterium is considered an abundant fuel for the future. Where do you suppose the deuterium for CANDU reactors is extracted from?

Of the 200 MeV from fission, only about 5 MeV comes from the fast neutrons (~2 MeV/neutron on average), while the rest comes from the fission products (~165 MeV), betas, and gamma radiation.

Fission reactors use 'burnable absorbers', e.g., gadolinia, erbia, or boron, to absorb neutrons in order to control excess reactivity and power distribution in the core. If one were to introduce Li-6 as a burnable absorber, one would simply reduce the other absorbers by a neutronically equivalent amount. So using Li-6 to make tritium is not a disadvantage per se, but it does produce only a limited amount of tritium.

There are various schemes for using fusion neutrons. The 14.1 MeV neutron can be slowed in a blanket and most of that thermal energy would be recovered before the neutron is absorbed - preferentially by Li-6 to make more T, or by U-238 or Th-232 to make fissile material. However, using fusion reactors to breed Pu-239 or U-233 is considered politically incorrect from a proliferation standpoint.

There is also the potential for (n, 2n) reactions in the blanket.

And who is the "Some [who] would like to pretend that "neutron multiplicators" like lead . . ."?


Tokamaks can't produce their own tritium. Already explained in the first post, third item in the list: "extra neutrons" are mathematically too few.
Not true.
d + d → t + p (~50%) or He-3 + n (~50%). Otherwise, neutrons are used to produce tritium via Li-6(n,α)T, a reaction that can also be applied in a fission reactor.
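The energetics of these reactions can be checked from standard mass excesses. A sketch, with the keV values rounded from the published atomic mass evaluations:

```python
# Q-values for the reactions above, from mass excesses (keV).
EXCESS = {
    "n": 8071.3, "H1": 7289.0, "H2": 13135.7,
    "H3": 14949.8, "He4": 2424.9, "Li6": 14086.9,
}

def q_value(reactants, products):
    """Energy released (MeV): mass excess in, minus mass excess out."""
    return (sum(EXCESS[x] for x in reactants)
            - sum(EXCESS[x] for x in products)) / 1000.0

print(f"D + T   -> He4 + n: {q_value(['H2','H3'], ['He4','n']):.2f} MeV")   # ~17.59
print(f"D + D   -> T  + p : {q_value(['H2','H2'], ['H3','H1']):.2f} MeV")   # ~4.03
print(f"Li6 + n -> He4 + T: {q_value(['Li6','n'], ['He4','H3']):.2f} MeV")  # ~4.78
```

The Li-6 reaction's positive Q-value is why it works even with thermal neutrons, unlike the endothermic Li-7(n,n'α)T route mentioned later in the thread.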

As fusion produces 1 neutron for 20MeV heat, instead of 3 neutrons for 200MeV heat for fission, the radioactivity induced in reactor materials would be 3 times higher than in fission reactors.
Again, 3 neutrons do not produce 200 MeV. A single neutron causes a fission, while the remaining neutrons are absorbed in the fuel or structural materials.

There are significant materials challenges in fusion reactors, and reactions such as d+d, d+He-3, or d+Li require more challenging confinement conditions, or perhaps more challenging feed-and-bleed processes.

ITER is not necessarily optimally configured for a commercial system.
 
  • #10
Enthalpy said:
- Tritium does not exist naturally, because its half-life is only about 12 years and it isn't produced on Earth by any process;

BALONEY! More misinformation/disinformation due to poor scholarship on the part of this poster.

There IS a natural process that creates tritium. One of the constituents of the radiation from the Sun that we call the "solar wind" is fast neutrons. Those fast neutrons interact with the ordinary nitrogen in our atmosphere, giving the following reaction:

N-14 + n → C-12 + T-3

See:

http://en.wikipedia.org/wiki/Tritium

This natural process creates tritium high in the atmosphere. The tritium combines with oxygen and rains to the ground as slightly tritiated water. All water on the planet is slightly radioactive due to the presence of this natural tritium.

Dr. Gregory Greenman
 
  • #11
Morbius said:
... One of the constituents of the radiation
from the Sun that we call the "solar wind" is fast neutrons. Those fast neutrons interact with the ordinary Nitrogen in our atmosphere giving the following reaction...
I don't have a dog in this fight, but I am disconcerted. Please elaborate in the following context.
1: The half-life of a free neutron is about 10 minutes, giving a mean lifetime of some 14 minutes.
2: A solar photon takes about 8 minutes to reach us; how many half-lives would it take a 4MeV neutron to reach us? (I do realise that it takes a lot of half-lives to get rid of all the neutrons!)
3: Given your response to those questions, are solar neutrons (especially fast neutrons) still a major source of isotopes in our atmosphere?

Thanks if you can help.
Jon
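Question 2 above can be answered with a rough estimate. A sketch, which ignores solar gravity, assumes a straight-line flight at constant speed, and neglects the (tiny, ~0.4%) time-dilation correction:

```python
import math

# How many half-lives does a 4 MeV neutron need to cross 1 AU?
C = 2.998e8            # speed of light, m/s
AU = 1.496e11          # Sun-Earth distance, m
M_N = 939.565          # neutron rest mass, MeV
KE = 4.0               # neutron kinetic energy, MeV
T_HALF = 611.0         # free-neutron half-life, s (~10.2 min)

gamma = 1.0 + KE / M_N                      # total energy / rest energy
beta = math.sqrt(1.0 - 1.0 / gamma**2)      # v/c, ~0.092
t_transit = AU / (beta * C)                 # ~90 minutes
halvings = t_transit / T_HALF               # ~9 half-lives
surviving = 2.0 ** (-halvings)              # fraction reaching Earth
print(f"Transit time: {t_transit/60:.0f} min = {halvings:.1f} half-lives")
print(f"Surviving fraction: {surviving:.2%}")
```

So only a fraction of a percent of any solar fast neutrons would survive the trip, which supports the skepticism behind the question.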
 
  • #12
Jon Richfield said:
I don't have a dog in this fight, but I am disconcerted. Please elaborate in the following context.
1: The half-life of a free neutron is about 10 minutes, giving a mean lifetime of some 14 minutes.
2: A solar photon takes about 8 minutes to reach us; how many half-lives would it take a 4MeV neutron to reach us? (I do realise that it takes a lot of half-lives to get rid of all the neutrons!)
3: Given your response to those questions, are solar neutrons (especially fast neutrons) still a major source of isotopes in our atmosphere?

Thanks if you can help.
Jon

Cosmic ray spallation produces Tritium in our atmosphere.
 
  • #13
GiftOfPlasma said:
Cosmic ray spallation produces Tritium in our atmosphere.
Thanks GP; that I have no difficulty with.
Cheers,
Jon
 
  • #14
A D-T fusion tokamak power reactor will produce its own tritium in-situ through transmutation of the lithium tritium-breeding blanket, as I'm sure most of you know.

Nobody denies that Teller-Ulam bombs fuelled with LiD (and very little, if any, 3H initially inside the weapon) work very effectively, do they?
 
  • #15
What is the point of fusion?

Is it simply that fusion is more politically acceptable than fission? Let's say we could be realistic about nuclear. Is the radioactive waste (e.g., activated shielding) from a tokamak honestly so much better to deal with than that from a fission station? (I thought both were trivial to deal with, especially compared with radioactive coal ash.) Is there a genuine shortage of fission fuel that fusion will overcome? Do we expect fusion to work out economically superior to fission and to, say, solar thermal? Is it safer than fission (can I presume that both are safer than solar, and far safer than coal)?
 
  • #16
cesiumfrog said:
What is the point of fusion?

Is it simply that fusion is more politically acceptable than fission? Let's say we could be realistic about nuclear. Is the radioactive waste (e.g., activated shielding) from a tokamak honestly so much better to deal with than that from a fission station? (I thought both were trivial to deal with, especially compared with radioactive coal ash.) Is there a genuine shortage of fission fuel that fusion will overcome? Do we expect fusion to work out economically superior to fission and to, say, solar thermal? Is it safer than fission (can I presume that both are safer than solar, and far safer than coal)?

Reprocessing spent fission fuel is definitely not trivial; it requires very expensive facilities and safeguards. Fusion has none of those drawbacks. Furthermore, while fission power is very safe, history has shown it is not foolproof. Fusion has no risk of a catastrophe such as Chernobyl/TMI/SL-1/Windscale. Also, there is no proliferation concern with fusion fuel.

Once the engineering technology to make fusion practical is developed, I believe it will be the most effective source of power for humanity. Renewable sources such as solar or wind can never be economical; the power density is just too low.

As a little aside: while fission fuel is quite plentiful, it is not as unlimited and ubiquitous as fusion fuel. And once fission fuel is used up, it is gone forever and there is no way to replenish it, since it can only be created in supernovas. My conjecture is that humanity may be better served by saving as much fission fuel as we can for the distant future for space travel applications, where its high power density may be irreplaceable by other sources.
 
  • #17
Frankly QP, although I am a slightly sceptical supporter of fusion power, I certainly am a supporter. I do not think that the things you said should need saying. As humans we should be investigating whatever we reasonably can investigate and in particular whatever might reasonably be expected to improve our position relative to nature. Such investigations would include both the academic and the possibly applicable.
Fusion research meets both criteria. How many lines of fusion research we should be investigating apart from Tokamak is a moot point. I would prefer to see several more, given that there are quite a few ideas that look promising.
But that is a matter of detail.
Go well,
Jon
 
  • #18
Several years ago, public information and talks about plasma fusion power (tokamaks, stellarators, or something else?) at Columbia University seemed to just stop. The rumor was that there had been some kind of realization or breakthrough that meant the reaction was easy to achieve and could even be used for a bomb. The only person I knew there said it was now classified and he wouldn't talk about it. Does anybody have any idea what this is about, or is it just baloney? (I thought something like this might be possible with opposing neutral-beam heaters, but that's probably not it.)
 
  • #19
Bernie G said:
Several years ago it seems public information/talks about plasma fusion power (Tokamaks, Stellarators, or?) at Columbia University just stopped. The rumor was there was some kind of realization or breakthrough that meant the reaction was easy to achieve and could even be used for a bomb. The only person I knew there said it was now classified and he wouldn't talk about it. Does anybody have any idea what this is about or is it just baloney? (I thought something like this might be possible with opposing neutral beam heaters, but that's probably not it.)
Fusion power is based on a controlled fusion process with a moderate power density.

Thermonuclear (fusion) weapons are based on a short term (microsecond) process that it is initiated with a fission trigger.

The two processes are very different, just as a nuclear power plant and a conventional fission warhead are.

So far controlled thermonuclear (fusion) for the commercial production of electrical energy has proven elusive.
 
  • #20
9 scientists in a room... 10 different opinions.

I agree every aspect of fusion should be fully funded and researched.

If only we could generate muons more efficiently (and make them a more efficient catalyst)! I've always imagined that a real working fusion power plant will have to combine the advantages of every approach to fusion, and somehow eliminate each approach's drawbacks.

My greatest fear is that ITER will fizzle out (pun intended) and research dollars for fusion will dry up faster than a puddle of heavy water.
 
  • #21
Well, I'm not an expert, but the OP does have a point. Although there is a natural abundance of tritium due to cosmic radiation, it is of the order of 10^(-17) or so; in other words, totally inexploitable as a fuel source for commercial energy production.

As I understand the OP, his point is the following: given that tritium has to come from an artificial source, usually neutron bombardment of Lithium, there is a bookkeeping problem:

a) one fusion reaction consumes one tritium nucleus, and produces one neutron.

b) if we have only a tritium production of one tritium atom per neutron, this is never going to achieve auto-refueling.

c) if a serious fraction of the to be burned tritium has to come from another source, nuclear fission, then we have the following problem:

c1) nuclear fission liberates ~ 200 MeV and produces of the order of 2.5 neutrons per fission, of which 1 neutron is going to be used to sustain the fission chain, so at most 1.5 neutrons are available for doing something with, in the optimal case, producing a tritium.

This means that in order to produce a single tritium atom, a nuclear facility needs to liberate about 130 MeV of energy (200 MeV / 1.5). This energy can of course be used for, say, electricity production, BUT the facility still needs to liberate it. Note that this uses the wildly optimistic hypothesis that every neutron not causing a fission goes on to make tritium. That is impossible. So in reality much more fission energy must be liberated for each neutron that is absorbed to form tritium.

However, with this single tritium atom, we can only produce something like 20 MeV in a fusion reactor.

If all of the tritium for a fusion reactor were produced in a nuclear reactor, the whole fusion proposition becomes ridiculous:

In order to have a fusion power plant of 2 GW thermal, one needs to operate 13 GW of thermal power in fission power plants. In other words, this fusion stuff is a meager "booster" of fission power.

Of course, thanks to the neutron from the fusion reaction itself, one can still produce SOME tritium, but not as much as there is consumed, if one neutron makes one tritium.

Let us say that one neutron from fusion will produce, on average, 0.8 tritium (and 0.2 is lost somewhere in the structure). Then this comes down to needing one external tritium per 4 internally produced tritiums to keep the reactor going.

So we STILL need more power (13 / 5 = 2.6 GW) from fission plants than we can have in fusion plants, if the blanket has a production rate of 0.8.

So the OP has a point. However, the point is also that with FAST neutrons, there is another tritium production reaction: Li-7 + n --> He-4 + T + n

This is a way to get more tritium out of lithium than one has neutrons.

But the OP is far from stupid: tritium production does indeed seem to be yet another bottleneck for commercial fusion power, and the blanket needs to achieve self-sufficiency if this power source is not to be ridiculous, needing a bigger fission power station next to it than the power it can deliver itself.

As I said, I'm not an expert, and I don't know the state of the art of this regeneration process in a fusion reactor blanket.
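The bookkeeping above can be condensed into a short numerical sketch. All inputs are the post's own assumed figures, not engineering data:

```python
# Numerical sketch of the tritium bookkeeping argument.
E_FISSION = 200.0          # MeV per fission
SPARE_NEUTRONS = 1.5       # neutrons available after sustaining the chain
E_FUSION = 20.0            # MeV per D-T event (the post's rounded figure)

# Fission energy liberated per externally supplied tritium atom.
e_per_tritium = E_FISSION / SPARE_NEUTRONS          # ~133 MeV

# Case 1: all tritium bred externally in fission reactors.
p_fusion = 2.0                                      # GW thermal (fusion)
p_fission = p_fusion * e_per_tritium / E_FUSION     # ~13 GW
print(f"All-external breeding: {p_fission:.1f} GW of fission")

# Case 2: blanket breeds 0.8 tritium per fusion neutron, so only
# 0.2 tritium per reaction must come from fission.
blanket_ratio = 0.8
p_fission_partial = p_fission * (1.0 - blanket_ratio)  # ~2.7 GW
print(f"With 0.8 blanket breeding: {p_fission_partial:.1f} GW of fission")
```

This reproduces the post's ~13 GW and ~2.6 GW figures; the conclusion hinges entirely on the assumed blanket breeding ratio staying below 1.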
 
  • #22
I'm not an expert either, but since France is the main sponsor (they practically live on nuclear power, and thus should be the most motivated by the issue), I don't see why anyone should complain. Other countries can choose whether or not to collaborate. Each share is still significantly less costly than many well-known space programs that also have little tangible return.

I think the most practical and immediate sources of energy are within private industry's reach.

Fusion should be seen as a large-scale academic/research program. These all need governments.
 
  • #23
I think that only D+D fusion reactions will be justified in the long-term perspective.
If people irreversibly burn such a useful and rare metal as lithium in nuclear reactions,
it will be another major environmental stupidity, one which may harm life and industry forever.

It is no secret that fusion power on an industrial scale is highly speculative for now. There might be other approaches that could outperform tokamaks, for example laser inertial fusion.
 
  • #24
Well, there is already another serious problem that points to this tritium difficulty: there is now a worldwide scarcity of He-3, which is used in cryogenics and also in neutron detection.
Well, He-3 is the decay product of tritium. If you had enough tritium, there would be no He-3 supply problem. So the fact that there is such a He-3 shortage indicates that tritium is a very scarce resource.
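The tritium-He-3 link is easy to quantify. A sketch, where the 1 kg stockpile is a purely hypothetical figure:

```python
# He-3 harvest from tritium decay (T -> He-3, half-life ~12.3 years).
T_HALF_YEARS = 12.32
stock_kg = 1.0   # hypothetical tritium stockpile

# Fraction of the stockpile decaying in one year.
frac = 1.0 - 2.0 ** (-1.0 / T_HALF_YEARS)      # ~5.5% per year
he3_kg = stock_kg * frac                       # He-3 and T masses are nearly equal
print(f"He-3 harvested per year from 1 kg T: {he3_kg*1000:.0f} g")  # ~55 g
```

So a large tritium inventory would indeed translate into tens of grams of He-3 per kilogram per year, which is why the He-3 shortage is a good proxy for tritium scarcity.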
 
  • #25
vanesch said:
Well, I'm not an expert, but the OP does have a point. Although there is a natural abundance of tritium due to cosmic radiation, it is of the order of 10^(-17) or so; in other words, totally inexploitable as a fuel source for commercial energy production.

As I understand the OP, his point is the following: given that tritium has to come from an artificial source, usually neutron bombardment of Lithium, there is a bookkeeping problem:

a) one fusion reaction consumes one tritium nucleus, and produces one neutron.

b) if we have only a tritium production of one tritium atom per neutron, this is never going to achieve auto-refueling.

c) if a serious fraction of the to be burned tritium has to come from another source, nuclear fission, then we have the following problem:

c1) nuclear fission liberates ~ 200 MeV and produces of the order of 2.5 neutrons per fission, of which 1 neutron is going to be used to sustain the fission chain, so at most 1.5 neutrons are available for doing something with, in the optimal case, producing a tritium.

This means that in order to produce a single tritium atom, a nuclear facility needs to liberate about 130 MeV of energy (200 MeV / 1.5). This energy can of course be used for, say, electricity production, BUT the facility still needs to liberate it. Note that this uses the wildly optimistic hypothesis that every neutron not causing a fission goes on to make tritium. That is impossible. So in reality much more fission energy must be liberated for each neutron that is absorbed to form tritium.

However, with this single tritium atom, we can only produce something like 20 MeV in a fusion reactor.

If all of the tritium for a fusion reactor were produced in a nuclear reactor, the whole fusion proposition becomes ridiculous:

In order to have a fusion power plant of 2 GW thermal, one needs to operate 13 GW of thermal power in fission power plants. In other words, this fusion stuff is a meager "booster" of fission power.

Of course, thanks to the neutron from the fusion reaction itself, one can still produce SOME tritium, but not as much as there is consumed, if one neutron makes one tritium.

Let us say that one neutron from fusion will produce, on average, 0.8 tritium (and 0.2 is lost somewhere in the structure). Then this comes down to needing one external tritium per 4 internally produced tritiums to keep the reactor going.

So we STILL need more power (13 / 5 = 2.6 GW) of fission plants than the 2 GW of fusion plants they feed, if the blanket has a production rate of 0.8.

So the OP has a point. However, the point is also that with FAST neutrons, there is another tritium production reaction: Li-7 + n --> He-4 + T + n

This is a way to get more tritium out of lithium than one has neutrons.

But the OP is far from stupid: tritium production seems indeed to be yet another bottleneck to commercial fusion power, and the blanket needs to achieve self-sufficiency if this power source is not to be ridiculous, needing a bigger fission power station next to it than the power it can deliver itself.

As I said, I'm not an expert, and I don't know the state of the art of this regeneration process in a fusion reactor blanket.

You are missing a crucial point here, the same as the OP. Fission reactors have a huge excess of neutrons up until the very end of the cycle. This is why boron and other burnable poisons are used to control the reactivity. The neutrons that are absorbed by boron are basically wasted. If you were to use tritium-breeding inserts in the reactor, these would merely absorb neutrons that would otherwise have been wasted anyway. You would simply use a lower boron concentration. The impact on the fuel economics would be negligible to non-existent. Your calculations regarding the energy used by fission plants to fuel fusion plants are flawed at the fundamental level. Thermal energy produced by a fission plant does not magically disappear just because you are breeding some tritium as a bonus.
 
  • #26
QuantumPion said:
You are missing a crucial point here, the same as the OP. Fission reactors have a huge excess of neutrons up until the very end of the cycle. This is why boron and other burnable poisons are used to control the reactivity. The neutrons that are absorbed by boron are basically wasted. If you were to use tritium-breeding inserts in the reactor, these would merely absorb neutrons that would otherwise have been wasted anyway. You would simply use a lower boron concentration. The impact on the fuel economics would be negligible to non-existent. Your calculations regarding the energy used by fission plants to fuel fusion plants are flawed at the fundamental level. Thermal energy produced by a fission plant does not magically disappear just because you are breeding some tritium as a bonus.
It's already being done.
 
  • #27
Astronuc said:
It's already being done.

Yes, I asked a colleague of mine about this and was told that the DoE has in the past contracted with power reactors to breed tritium using poison inserts, presumably for weapon stockpile purposes. So I imagine it would be feasible to do so on a larger scale for fusion fuel breeding as well.
 
  • #28
QuantumPion said:
You are missing a crucial point here, the same as the OP. Fission reactors have a huge excess of neutrons up until the very end of the cycle. This is why boron and other burnable poisons are used to control the reactivity. The neutrons that are absorbed by boron are basically wasted. If you were to use tritium-breeding inserts in the reactor, these would merely absorb neutrons that would otherwise have been wasted anyway. You would simply use a lower boron concentration. The impact on the fuel economics would be negligible to non-existent. Your calculations regarding the energy used by fission plants to fuel fusion plants are flawed at the fundamental level. Thermal energy produced by a fission plant does not magically disappear just because you are breeding some tritium as a bonus.

I think YOU are missing the point of the OP (which I am repeating): nobody is saying that the fission energy is LOST; it is just that you need a fission plant of comparable or larger size than the fusion reactor it would fuel. BTW, there IS a solution to the problem, which people ARE working on: a self-sufficient breeding blanket, but it needs to make more than one tritium atom with a single neutron.

Let us repeat the reasoning:

You have, in a fission plant, a fission of a single U-235 nucleus. This fission will:
a) liberate in the end about 200 MeV, energy which will go into the production of thermal energy to be converted partially in electricity IN THE FISSION PLANT.
b) liberate on average 2.5 neutrons.

Now, what can we do with those 2.5 neutrons ?
We have to sustain the chain reaction, so for each fission of U-235, we will need to fission another U-235 and hence use up precisely one neutron for this (this is the famous k-factor, which equals 1 in a sustained fission chain reaction). So of our 2.5-neutron budget, we lose 1.
In a normal reactor, the control rods, the boron in the water and the structure itself (just as well as the fuel) account for the capture of these 1.5 neutrons, but in a crazily idealised reactor, we could use these 1.5 neutrons, ALL of them, to be captured by Li-6 and make tritium. This means we make, for each fission of a U-235 atom, 1.5 tritium atoms.

In fact, we all know that is too much of an idealisation, and it will not work with a thermal reactor, for the simple reason that a thermal reactor cannot be a breeder (if we replaced the Li-6 with U-238 in our example, we would have a thermal breeder reactor with a breeding ratio of about 1.5). But let's stick with our "upper limit" scenario.

So what we have, is that in an idealised scenario, a reactor that has dissipated 200 MeV of heat because of a single U-235 fission reaction, has produced 1.5 tritium atoms.

With these 1.5 tritium atoms, we can have, in a fusion reactor, 1.5 fusion reactions of the kind T - D. A single fusion reaction liberates 17.6 MeV, let us be nice and say that it liberates 20 MeV. So 1.5 fusion reactions will liberate 30 MeV at most.

So what we have is that a fission power plant that produced 200 MeV of thermal fission energy (which can be converted into electricity with the necessary thermodynamic losses) can provide enough tritium to power a fusion power plant that provides 30 MeV of thermal fusion energy (which can be converted in a similar way into electricity).

So without any breeding blanket, fusion power is INDEED ridiculous, because for every 20 GW of fission power plants, you can also have a single 3 GW fusion power plant operating.

I'm NOT saying (and the OP is NOT saying) that these 20 GW of fission power are LOST. We are just saying that fusion is not going to REPLACE fission power in this case, as you have to build roughly seven times more fission power than you can build fusion power plants that use up the thus-generated tritium.

Even if you can have a blanket that is 80% efficient, that is to say, can convert, for every 10 fusion reactions, and so 10 fusion neutrons, 8 Li-atoms into tritium, you STILL have to have MORE fission power plants than fusion power plants.

It is ABSOLUTELY NECESSARY, for fusion NOT to be ridiculous, that the blanket is self-sufficient, and that's a difficult task, because you need to make more than one tritium atom with a single neutron. It is as if you had to sustain a fission chain reaction with a neutron yield of 1 per fission instead of 2.5.
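The bookkeeping walked through above can be condensed into a few lines of Python. The function name and the breeding-ratio parametrisation are illustrative only; the constants are the post's own idealised figures:

```python
# Back-of-envelope balance: fission thermal power needed to keep one
# unit of D-T fusion thermal power fuelled, as a function of the
# blanket's tritium breeding ratio (tritium bred per fusion neutron).

E_FISSION = 200.0   # MeV of thermal energy per U-235 fission
NU_SPARE = 1.5      # spare neutrons per fission (2.5 minus 1 for the chain)
E_FUSION = 20.0     # MeV per D-T fusion (17.6 MeV, generously rounded up)

def fission_support_ratio(tbr):
    """Fission thermal power per unit of fusion thermal power, if a
    fraction (1 - tbr) of the tritium must be bred externally in
    fission reactors at one tritium per spare neutron (best case)."""
    external = max(0.0, 1.0 - tbr)  # fraction of tritium supplied externally
    return external * (E_FISSION / NU_SPARE) / E_FUSION

print(fission_support_ratio(0.0))  # ≈ 6.7: ~20 GW fission per 3 GW fusion
print(fission_support_ratio(0.8))  # ≈ 1.3: still more fission than fusion
print(fission_support_ratio(1.0))  # 0.0: a self-sufficient blanket
```

Note that even the generous 0.8 blanket leaves the ratio above 1, which is the whole point of the post.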
 
Last edited:
  • #29
QuantumPion said:
Yes, I asked a colleague of mine about this and was told that the DoE has in the past contracted with power reactors to breed tritium using poison inserts, presumably for weapon stockpile purposes. So I imagine it would be feasible to do so on a larger scale for fusion fuel breeding as well.

The problem with words like "huge" is that they do not allow for a detailed balance.

In a fission reactor, a single fission liberates about 2.5 neutrons. 1 of these neutrons is necessary to sustain the chain reaction. So this "huge amount" of lost and absorbed neutrons is 1.5 per fission.

In other words, in a fission reactor, you have a huge flux of neutrons that are to be absorbed, but you also have a huge number of fission reactions, and you have a huge amount of liberated fission energy, and at the end of the day, you find that:

"huge amount of fission energy" / "huge amount of neutrons wasted" = 200 MeV / 1.5 = 130 MeV

No matter how you arrange it.

So for each 1.5 neutrons "wasted" that could eventually, potentially be turned into a tritium producing reaction, your reactor has to liberate 130 MeV of fission energy (which you can use at your good will, for instance to make electricity).
 
  • #30
Another (potentially ridiculous) solution to the problem would be to produce tritium with a spallation source. I didn't do the calculation, but intuitively I'm fairly confident that producing tritium with a spallation source is going to consume more electricity than you can gain by burning the tritium in a fusion reactor.
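That intuition can be tested with a rough sketch. All the figures below are my assumptions, not from the thread: roughly 25 spallation neutrons per 1 GeV proton on a heavy target, 30% accelerator wall-plug efficiency, 35% thermal-to-electric conversion, and the ideal one tritium per captured neutron:

```python
# Rough economics of spallation-bred tritium (assumed figures, see above).

PROTON_ENERGY_MEV = 1000.0   # 1 GeV proton beam
NEUTRONS_PER_PROTON = 25.0   # assumed spallation yield on a heavy target
ACCEL_EFFICIENCY = 0.30      # assumed wall-plug-to-beam efficiency
THERMAL_TO_ELECTRIC = 0.35   # assumed conversion of fusion heat

# Electrical energy spent per tritium atom (one tritium per neutron, ideal):
e_in = PROTON_ENERGY_MEV / NEUTRONS_PER_PROTON / ACCEL_EFFICIENCY

# Electrical energy recovered by burning that tritium via D-T (17.6 MeV):
e_out = 17.6 * THERMAL_TO_ELECTRIC

print(e_in)   # ≈ 133 MeV(e) spent per tritium
print(e_out)  # ≈ 6.2 MeV(e) recovered per tritium
```

Under these assumed numbers the input exceeds the output by a factor of about 20, which supports the intuition above.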
 
  • #31
vanesch said:
In other words, in a fission reactor, you have a huge flux of neutrons that are to be absorbed, but you also have a huge number of fission reactions, and you have a huge amount of liberated fission energy, and at the end of the day, you find that:

"huge amount of fission energy" / "huge amount of neutrons wasted" = 200 MeV / 1.5 = 130 MeV
This is not how it goes.

The fission process generates about 200-205 MeV per fission. Of that, about 4-5 MeV are carried away by 2 or 3 neutrons (on average about 2.3-2.4 n). About 160-170 MeV are released as kinetic energy of the two major fission products, radionuclides such as Te, I, Xe, Cs, Ba, La and other rare earths . . . and As, Se, Br, Kr, Rb, Sr, Y, Zr, Nb . . . . Additional energy is released by beta decay, prompt gammas, decay gammas, and delayed neutrons (about 0.6% of all neutrons). It is the delayed neutrons that allow for control of the nuclear reaction.
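As a rough cross-check of this partition, textbook-style approximate values (the individual numbers below are standard estimates, not taken from this thread) do sum to about 200 MeV:

```python
# Approximate partition of U-235 fission energy (textbook-style values,
# used only to illustrate that the pieces sum to about 200 MeV).
partition_mev = {
    "fission fragment kinetic energy": 168.0,
    "prompt neutrons": 5.0,
    "prompt gammas": 7.0,
    "beta decay of fission products": 7.0,
    "delayed gammas": 6.0,
    "radiative capture gammas (recoverable)": 7.0,
}
total = sum(partition_mev.values())
print(total)  # ≈ 200 MeV deposited in the reactor
# About 10 MeV more leaves with antineutrinos and is never recovered.
```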

In LWRs, the fast neutrons must be thermalized (slowed down) from 1-2 MeV to ~0.025 eV, which is what hydrogen in water does quite well.

One fission neutron must survive to cause another fission. The remaining neutrons are absorbed by the coolant (H + n => D or D + n => T, but that's a very small fraction), by the structural material (steels and nickel alloys, and very little in Zr-based cladding), and by the fuel (U238 + n => U239 => Np239 => Pu239, or Np239 + n => Np240 => Pu240, and a host of other transuranic isotopes). In LWRs, about half the fissions in high burnup fuel are actually in Pu239 rather than the U235.

In (d+t) fusion, the neutron actually carries away a substantial portion of the energy (14.1 MeV of 17.6 MeV) and there is one neutron that must go somewhere - out of the fusion reactor plasma into the first wall or blanket surrounding the plasma. Ideally that neutron is captured by Li to produce more T for fusion, or it could be used for a fission reaction in a so-called fusion-fission hybrid.

d+t fusion reaction is used because it is the easiest with which to produce energy. Ideally d+d fusion would be used, if perfected, because D is much more plentiful than T, and it's not radioactive. But d+d reaction has a lower cross-section at a given temperature, and to achieve the same reaction rate, d+d plasmas must operate at a higher temperature (and pressure) than d+t plasmas.

d+d => p+t (~0.5) or n+He3 (~0.5). The t and He3 in the plasma may undergo d+t or d+He3: d+t => He4 + n and d+He3 => He4 + p. Aneutronic reactions are nice because they don't produce neutrons, so the energy goes into the charged particles, which heat the plasma and whose energy can ideally be extracted somehow.
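The branches above have well-known Q-values. Here is a short sketch of the fully "catalysed" D-D chain, in which the t and He3 products are burned as well; the Q-values are standard figures, the bookkeeping is mine:

```python
# Q-values (MeV) of the reactions listed above (standard figures).
Q = {
    "d+d -> p+t": 4.03,
    "d+d -> n+he3": 3.27,
    "d+t -> he4+n": 17.59,
    "d+he3 -> he4+p": 18.35,
}

# Fully catalysed D-D: one d+d fusion of each branch, whose t and He3
# products are then burned via d+t and d+he3.  Six deuterons in total.
q_total = sum(Q.values())
per_deuteron = q_total / 6.0

print(q_total)       # ≈ 43.2 MeV per complete chain
print(per_deuteron)  # ≈ 7.2 MeV per deuteron consumed
```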

The various concepts for fusion face the same problems but in different ways - namely how to extract useful energy from the fusion reaction and minimize the energy put into the plasma to maintain the conditions required for fusion.
 
  • #32
Astronuc said:
This is not how it goes.

The fission process generates about 200-205 MeV per fission. Of that, about 4-5 MeV are carried away by 2 or 3 neutrons (on average about 2.3-2.4 n). About 160-170 MeV are released as kinetic energy of the two major fission products, radionuclides such as Te, I, Xe, Cs, Ba, La and other rare earths . . . and As, Se, Br, Kr, Rb, Sr, Y, Zr, Nb . . . . Additional energy is released by beta decay, prompt gammas, decay gammas, and delayed neutrons (about 0.6% of all neutrons). It is the delayed neutrons that allow for control of the nuclear reaction.

In LWRs, the fast neutrons must be thermalized (slowed down) from 1-2 MeV to ~0.025 eV, which is what hydrogen in water does quite well.

Yes, I'm not contradicting this. I'm confirming all that. But you will agree with me that in total, in the reactor, about 200 MeV is released for a single fission, right ? It doesn't matter what particle carries away what energy, in total, about 200 MeV is released and finally converted to thermal power of the reactor, right ?

And from this single fission, initially, 2.5 neutrons were available, and, as you say, one is used to continue the chain reaction, so 1.5 neutrons go "elsewhere". It is THESE neutrons that are available for absorption, right?
(The small fraction of delayed neutrons and so on is not seriously going to alter the balance.)

One fission neutron must survive to cause another fission. The remaining neutrons are absorbed by the coolant (H + n => D or D + n => T, but that's a very small fraction), by the structural material (steels and nickel alloys, and very little in Zr-based cladding), and by the fuel (U238 + n => U239 => Np239 => Pu239, or Np239 + n => Np240 => Pu240, and a host of other transuranic isotopes). In LWRs, about half the fissions in high burnup fuel are actually in Pu239 rather than the U235.

All this is correct but irrelevant to the issue...

What counts is that to have X neutrons available for absorption in a nuclear reactor, one needs to fission (X / 1.5) fuel atoms, be it U-235 or Pu-239 or even something else.

And each fission will liberate about 200 MeV of finally thermal energy.

If you want to have 1.5 10^26 neutrons absorbed, you will have to fission 10^26 atoms of U-235 or Pu-239 or whatever, and probably more, because here we suppose ideally that ALL neutrons that are liberated by fission and do not fission another U-235 atom are usefully absorbed.

If you work with Li-6, with 1.5 10^26 neutrons absorbed, you can have 1.5 10^26 tritium atoms. So in order to produce 1.5 10^26 tritium atoms, you will have had to fission 10^26 fuel atoms.

10^26 fissions of fuel atoms will have liberated a total amount of energy equal to 10^26 times 200 MeV (about 3200 TJ). If this is done in a year's time, this comes down to 101 MW.

So in order to run a 15 MW thermal fusion reactor, one needs a 101 MW fission reactor next to it. That's ridiculous. It is not useless, but it is ridiculous to develop technology for that. Just make the fission plant somewhat bigger and you don't need no stinkin' fusion :smile:

Now, with those 1.5 10^26 tritium atoms, we can do 1.5 10^26 fusions, and hence liberate in a fusion reactor 1.5 10^26 x 20 MeV, which amounts to 15.2 MW if we use this fuel up during a year.

So a 101 MW fission reactor can provide, in very ideal circumstances, enough tritium fuel for a 15.2 MW fusion reactor to run continuously, if the reactor isn't providing any tritium breeding by itself.
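The conversion from fissions per year to average thermal power can be reproduced directly; the constants below are standard approximations:

```python
# Convert the idealised yearly fuel balance into average thermal power.
MEV_TO_J = 1.602e-13   # joules per MeV
YEAR_S = 3.156e7       # seconds per year

fissions = 1.0e26           # fissions per year
tritium = 1.5 * fissions    # 1.5 tritium atoms per fission (ideal best case)

E_fission_J = fissions * 200.0 * MEV_TO_J  # total fission energy per year
E_fusion_J = tritium * 20.0 * MEV_TO_J     # total D-T fusion energy per year

print(E_fission_J / 1e12)          # ≈ 3204 TJ of fission energy
print(E_fission_J / YEAR_S / 1e6)  # ≈ 101 MW of fission, averaged over a year
print(E_fusion_J / YEAR_S / 1e6)   # ≈ 15 MW of fusion
```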

In (d+t) fusion, the neutron actually carries away a substantial portion of the energy (14.1 MeV of 17.6 MeV) and there is one neutron that must go somewhere - out of the fusion reactor plasma into the first wall or blanket surrounding the plasma. Ideally that neutron is captured by Li to produce more T for fusion, or it could be used for a fission reaction in a so-called fusion-fission hybrid.

Yes, but all that isn't doing anything to the issue.

d+t fusion reaction is used because it is the easiest with which to produce energy. Ideally d+d fusion would be used, if perfected, because D is much more plentiful than T, and it's not radioactive. But d+d reaction has a lower cross-section at a given temperature, and to achieve the same reaction rate, d+d plasmas must operate at a higher temperature (and pressure) than d+t plasmas.

Yes, so for the moment we would already be very happy by having a reactor run on D+T.

The point is that if with D+T fusion, one doesn't achieve self-sufficiency with a breeding blanket, it becomes, as a commercial power production mechanism, a ridiculous technique, and that was the OP's point. I think he's right.
 
  • #33
vanesch said:
Yes, I'm not contradicting this. I'm confirming all that. But you will agree with me that in total, in the reactor, about 200 MeV is released for a single fission, right ? It doesn't matter what particle carries away what energy, in total, about 200 MeV is released and finally converted to thermal power of the reactor, right ?
I was only objecting to ""huge amount of fission energy" / "huge amount of neutrons wasted" = 200 MeV / 1.5 = 130 MeV"

The point is that if with D+T fusion, one doesn't achieve self-sufficiency with a breeding blanket, it becomes, as a commercial power production mechanism, a ridiculous technique, and that was the OP's point. I think he's right.
Unless one adds in Be which can produce (n,2n) reactions, or a fission blanket. But that adds fissions to the system, which is at odds with pursuing fusion as a replacement for fission.

In addition, Li has become more critical to rechargeable batteries, and the prospect of consuming Li as a fuel may not work because demand from other uses will increase its value/cost.

Finally, to claim ITER is useless because we can't produce tritium in large quantities simply ignores the fact that d+d is the preferred reaction, but d+t is easier to accomplish. If d+t fusion can be successfully demonstrated, then d+d could work as well (maybe). Ideally, fusion would be based on aneutronic reactions, which isn't the case for the easiest reaction, nor even for d+d, in which about half the fusions produce neutrons.
 
  • #34
Astronuc said:
I was only objecting to ""huge amount of fission energy" / "huge amount of neutrons wasted" = 200 MeV / 1.5 = 130 MeV"

Nevertheless, it is correct :wink:

(By "wasted" I mean the neutrons that, in a normal reactor, are absorbed in control elements, in fuel without causing fission, in structure, etc., and so are at most available for T production if we don't waste them.)

Each neutron that can be "wasted", say, to make T, requires the reactor to dissipate at least 130 MeV of fission energy.

Unless one adds in Be which can produce (n,2n) reactions, or a fission blanket. But that adds fissions to the system, which is at odds with pursuing fusion as a replacement for fission.

Yes, that's the point. Actually, until I read the OP's post, I never realized how critical this blanket was. It will be quite difficult to achieve self-sufficiency, because of course you cannot capture the neutrons in 4 pi without any structural capture and loss. So you need a neutron multiplier.

Non-fission neutron multiplication isn't easy (apart from spallation). If it were, people would use it to make thermal breeders with uranium.
In addition, Li has become more critical to rechargeable batteries, and the prospect of consuming Li as a fuel may not work because demand from other uses will increase its value/cost.

Indeed.

Finally, to claim ITER is useless because we can't produce tritium in large quantities simply ignores the fact that d+d is the preferred reaction, but d+t is easier to accomplish. If d+t fusion can be successfully demonstrated, then d+d could work as well (maybe).

Yes, of course, but when you look at the difficulties people have in realizing self-sustained, energetically useful, let alone commercially competitive, energy from D + T (that's the hope for the second half of this century), even granted tritium provision, D + D is for the 22nd century at best.

I'm not saying ITER is useless, but the tritium bottleneck is yet another difficulty as compared to the rosy pictures of "soon, clean energy here", no?

Fission seems the way to go for a long, long time still. Research is never useless; you always learn something. And ITER could be financed by three big mistakes of a trader in a front office. That's reasonable... :smile:
 
Last edited:
  • #35
vanesch said:
Each neutron that can be "wasted", say, to make T, requires the reactor to dissipate at least 130 MeV of fission energy.
But this is not correct. One neutron causes a fission which produces 200 MeV. The additional neutrons would carry away only 2-4 MeV. They could be absorbed in special assemblies to produce tritium, which is produced in the coolant through the (n, alpha) reaction with Li anyway. There is no 130 MeV being carried away by the extra neutrons from fission. Anyway, a substantial fraction of the extra neutrons is absorbed by the fuel (U-238), which is converted eventually to fissile Pu239 and (Pu-240, Pu-241), Am241, Cm244, and other TRUs.

BTW - thorium (Th-232) with U-233 is the basis of a thermal breeder reactor.
 
