Why ITER is a Useless Investment: The Truth about Tritium and Tokamaks

  • Thread starter: Enthalpy
  • Tags: ITER

Summary:
The discussion critiques the ITER project, arguing that tokamaks are ineffective due to their reliance on tritium, which is not naturally available and is produced in insufficient quantities by fission reactors. It highlights that a tokamak cannot generate its own tritium and requires an impractical amount of energy from fission reactors to operate. The participants express skepticism about the potential for tokamaks to replace fission energy, asserting that research funds would be better spent on alternative energy solutions like geothermal or wind storage. Concerns are raised about the long-term radioactivity and waste management issues associated with fusion reactors, contradicting claims of their cleanliness. Overall, the consensus is that ITER represents a misguided investment in an unproven and potentially flawed energy source.
  • #31
vanesch said:
In other words, in a fission reactor you have a huge flux of neutrons to be absorbed, but you also have a huge number of fission reactions and a huge amount of liberated fission energy, and at the end of the day you find that:

"huge amount of fission energy" / "huge amount of neutrons wasted" = 200 MeV / 1.5 ≈ 130 MeV
This is not how it goes.

The fission process generates about 200-205 MeV per fission. Of that, about 4-5 MeV is carried away by 2 or 3 neutrons (on average about 2.3-2.4 n). About 160-170 MeV is released as kinetic energy of the two major fission products, radionuclides such as Te, I, Xe, Cs, Ba, La and other rare earths... and As, Se, Br, Kr, Rb, Sr, Y, Zr, Nb... Additional energy is released by beta decay, prompt gammas, decay gammas, and delayed neutrons (about 0.6% of all neutrons). It is the delayed neutrons that allow for control of the nuclear chain reaction.

In LWRs, the fast neutrons must be thermalized (slowed down) from 1-2 MeV to ~0.025 eV, which is what hydrogen in water does quite well.
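
As a rough textbook estimate of how good hydrogen is at this: the mean logarithmic energy loss per elastic collision is ξ = 1 for hydrogen, so slowing down from E0 ≈ 2 MeV to Eth ≈ 0.025 eV takes about

$$n \approx \frac{\ln(E_0/E_{\text{th}})}{\xi_{\text{H}}} = \frac{\ln(2\times10^{6}/0.025)}{1} \approx 18$$

collisions, compared with roughly 115 collisions on carbon (ξ ≈ 0.16).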

One fission neutron must survive to cause another fission. The remaining neutrons are absorbed by the coolant (H + n => D or D + n => T, but that's a very small fraction), by the structural material (steels and nickel alloys, and very little in Zr-based cladding), and by the fuel (U238 + n => U239 => Np239 => Pu239, or Np239 + n => Np240 => Pu240, and a host of other transuranic isotopes). In LWRs, about half the fissions in high burnup fuel are actually in Pu239 rather than the U235.

In (d+t) fusion, the neutron actually carries away a substantial portion of the energy (14.1 MeV of 17.6 MeV) and there is one neutron that must go somewhere - out of the fusion reactor plasma into the first wall or blanket surrounding the plasma. Ideally that neutron is captured by Li to produce more T for fusion, or it could be used for a fission reaction in a so-called fusion-fission hybrid.
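
The 14.1 MeV figure follows from momentum conservation: the neutron and the alpha particle share the 17.6 MeV in inverse proportion to their masses (neglecting the initial kinetic energies), so

$$E_n \approx \frac{m_\alpha}{m_\alpha + m_n}\,Q \approx \frac{4}{5}\times 17.6\ \text{MeV} \approx 14.1\ \text{MeV}, \qquad E_\alpha \approx 3.5\ \text{MeV}.$$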

d+t fusion reaction is used because it is the easiest with which to produce energy. Ideally d+d fusion would be used, if perfected, because D is much more plentiful than T, and it's not radioactive. But d+d reaction has a lower cross-section at a given temperature, and to achieve the same reaction rate, d+d plasmas must operate at a higher temperature (and pressure) than d+t plasmas.

d+d => p+t (~0.5) or n+He3 (~0.5). The t and He3 in the plasma may then undergo d+t or d+He3: d+t => He4 + n and d+He3 => He4 + p. Aneutronic reactions are nice because they don't produce neutrons, so the energy goes into charged particles, which heat the plasma and whose energy can ideally be extracted somehow.
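
For reference, the standard Q-values of the reactions mentioned in this thread are roughly:

$$\begin{aligned}
\mathrm{d + t} &\rightarrow \mathrm{{}^4He + n}, & Q &\approx 17.6\ \text{MeV}\\
\mathrm{d + d} &\rightarrow \mathrm{t + p}, & Q &\approx 4.0\ \text{MeV}\\
\mathrm{d + d} &\rightarrow \mathrm{{}^3He + n}, & Q &\approx 3.3\ \text{MeV}\\
\mathrm{d + {}^3He} &\rightarrow \mathrm{{}^4He + p}, & Q &\approx 18.3\ \text{MeV}\\
\mathrm{{}^6Li + n} &\rightarrow \mathrm{t + {}^4He}, & Q &\approx 4.8\ \text{MeV}\\
\mathrm{{}^7Li + n} &\rightarrow \mathrm{t + {}^4He + n'}, & Q &\approx -2.5\ \text{MeV (endothermic)}
\end{aligned}$$

Note that breeding on Li-6 consumes the neutron and releases energy, while breeding on Li-7 returns a (slower) neutron but costs energy.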

The various concepts for fusion face the same problems but in different ways - namely how to extract useful energy from the fusion reaction and minimize the energy put into the plasma to maintain the conditions required for fusion.
 
  • #32
Astronuc said:
This is not how it goes.

The fission process generates about 200-205 MeV per fission. Of that, about 4-5 MeV is carried away by 2 or 3 neutrons (on average about 2.3-2.4 n). About 160-170 MeV is released as kinetic energy of the two major fission products, radionuclides such as Te, I, Xe, Cs, Ba, La and other rare earths... and As, Se, Br, Kr, Rb, Sr, Y, Zr, Nb... Additional energy is released by beta decay, prompt gammas, decay gammas, and delayed neutrons (about 0.6% of all neutrons). It is the delayed neutrons that allow for control of the nuclear chain reaction.

In LWRs, the fast neutrons must be thermalized (slowed down) from 1-2 MeV to ~0.025 eV, which is what hydrogen in water does quite well.

Yes, I'm not contradicting this. I'm confirming all that. But you will agree with me that in total, in the reactor, about 200 MeV is released for a single fission, right ? It doesn't matter what particle carries away what energy, in total, about 200 MeV is released and finally converted to thermal power of the reactor, right ?

And from this single fission, initially 2.5 neutrons were available and, as you say, one is used to continue the chain reaction, so 1.5 neutrons go "elsewhere". It is THESE neutrons which are available for absorption, right?
(the small amount of delayed neutrons and so on is not seriously going to alter the balance)

One fission neutron must survive to cause another fission. The remaining neutrons are absorbed by the coolant (H + n => D or D + n => T, but that's a very small fraction), by the structural material (steels and nickel alloys, and very little in Zr-based cladding), and by the fuel (U238 + n => U239 => Np239 => Pu239, or Np239 + n => Np240 => Pu240, and a host of other transuranic isotopes). In LWRs, about half the fissions in high burnup fuel are actually in Pu239 rather than the U235.

All this is correct but irrelevant to the issue...

What counts is that to have X neutrons available for absorption in a nuclear reactor, one needs to fission (X / 1.5) fuel atoms, be it U-235 or Pu-239 or even something else.

And each fission will liberate about 200 MeV of finally thermal energy.

If you want to have 1.5 × 10^26 neutrons absorbed, you will have to fission 10^26 atoms of U-235 or Pu-239 or whatever, and probably more, because here we suppose ideally that ALL neutrons that are liberated by fission and do not fission another fuel atom are usefully absorbed.

If you work with Li-6, with 1.5 × 10^26 neutrons absorbed you can have 1.5 × 10^26 tritium atoms. So in order to produce 1.5 × 10^26 tritium atoms, you will have had to fission 10^26 fuel atoms.

10^26 fissions of fuel atoms will have liberated a total amount of energy equal to 10^26 times 200 MeV (about 3200 TJ). If this is done in a year's time, this comes down to about 100 MW.

So in order to run a 15 MW thermal fusion reactor, one needs a ~100 MW fission reactor next to it. That's ridiculous. It is not useless, but it is ridiculous to develop technology for that. Just make the fission plant somewhat bigger and you don't need no stinkin' fusion :smile:

Now, with those 1.5 × 10^26 tritium atoms, we can do 1.5 × 10^26 fusions, and hence liberate in a fusion reactor 1.5 × 10^26 × 20 MeV, which amounts to about 15 MW if we use this fuel up during a year.

So a ~100 MW fission reactor can provide, in very ideal circumstances, enough tritium fuel for a 15 MW fusion reactor to run continuously, if the fusion reactor isn't doing any tritium breeding by itself.
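
For anyone who wants to check the bookkeeping, here is the same estimate as a small sketch (the 2.5 neutrons per fission, 200 MeV per fission and ~20 MeV per fusion are the round numbers used above, not precise data):

Code:
MeV = 1.602e-13                      # joules per MeV
year = 3.156e7                       # seconds per year

nu = 2.5                             # neutrons per fission (round number)
spare_per_fission = nu - 1.0         # one neutron must sustain the chain reaction
E_fission = 200.0 * MeV              # thermal energy per fission
E_fusion = 20.0 * MeV                # per D-T fusion, rounded up from 17.6 MeV as above

N_T = 1.5e26                         # tritium atoms wanted (one per spare neutron, ideal Li-6 capture)
fissions = N_T / spare_per_fission   # 1e26 fissions needed to free that many neutrons
P_fission = fissions * E_fission / year   # ~100 MW thermal in the fission plant
P_fusion = N_T * E_fusion / year          # ~15 MW thermal, if all the tritium is burned

print(f"fission thermal power: {P_fission / 1e6:.0f} MW")
print(f"fusion thermal power:  {P_fusion / 1e6:.0f} MW")
print(f"ratio: {P_fission / P_fusion:.1f}")

The ratio is just 200 MeV / (1.5 × 20 MeV) ≈ 6.7, independent of how many neutrons you choose to count.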

In (d+t) fusion, the neutron actually carries away a substantial portion of the energy (14.1 MeV of 17.6 MeV) and there is one neutron that must go somewhere - out of the fusion reactor plasma into the first wall or blanket surrounding the plasma. Ideally that neutron is captured by Li to produce more T for fusion, or it could be used for a fission reaction in a so-called fusion-fission hybrid.

Yes, but all that doesn't change the issue.

d+t fusion reaction is used because it is the easiest with which to produce energy. Ideally d+d fusion would be used, if perfected, because D is much more plentiful than T, and it's not radioactive. But d+d reaction has a lower cross-section at a given temperature, and to achieve the same reaction rate, d+d plasmas must operate at a higher temperature (and pressure) than d+t plasmas.

Yes, so for the moment we would already be very happy by having a reactor run on D+T.

The point is that if with D+T fusion, one doesn't achieve self-sufficiency with a breeding blanket, it becomes, as a commercial power production mechanism, a ridiculous technique, and that was the OP's point. I think he's right.
 
  • #33
vanesch said:
Yes, I'm not contradicting this. I'm confirming all that. But you will agree with me that in total, in the reactor, about 200 MeV is released for a single fission, right ? It doesn't matter what particle carries away what energy, in total, about 200 MeV is released and finally converted to thermal power of the reactor, right ?
I was only objecting to the statement: "huge amount of fission energy" / "huge amount of neutrons wasted" = 200 MeV / 1.5 ≈ 130 MeV.

The point is that if with D+T fusion, one doesn't achieve self-sufficiency with a breeding blanket, it becomes, as a commercial power production mechanism, a ridiculous technique, and that was the OP's point. I think he's right.
Unless one adds in Be, which can produce (n,2n) reactions, or a fission blanket. But that adds fissions to the system, which is at odds with pursuing fusion as a replacement for fission.

In addition, Li has become more critical to rechargeable batteries, and the prospect of consuming Li as a fuel may not work because the demand for other uses will increase its value/cost.

Finally, to claim ITER is useless because we can't produce tritium in large quantities simply ignores the fact that d+d is the preferred reaction, but d+t is easier to accomplish. If d+t fusion can be successfully demonstrated, then d+d could work as well (maybe). Ideally, fusion would be based on aneutronic reactions - which isn't the case with the easiest reaction, and not even with d+d, in which about half the fusions produce neutrons.
 
  • #34
Astronuc said:
I was only objecting to the statement: "huge amount of fission energy" / "huge amount of neutrons wasted" = 200 MeV / 1.5 ≈ 130 MeV.

Nevertheless, it is correct :wink:

(By "wasted", I meant the neutrons that, in a normal reactor, are indeed "wasted" in the sense of being absorbed in control elements, in fuel without causing fission, in structure... and so are, at most, eventually available for T production if we don't "waste" them.)

The neutrons produced that can be "wasted", say to make T, will each require the dissipation of at least 130 MeV of fission energy.
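
In formula form, with ν ≈ 2.5 neutrons per fission and one of them reserved for the chain reaction:

$$E_{\text{fission per spare neutron}} = \frac{200\ \text{MeV}}{\nu - 1} \approx \frac{200\ \text{MeV}}{1.5} \approx 133\ \text{MeV},$$

compared with 17.6 MeV per D-T fusion, i.e. roughly a factor of 7 in thermal energy in favour of the fission plant that supplies the tritium.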

Astronuc said:
Unless one adds in Be, which can produce (n,2n) reactions, or a fission blanket. But that adds fissions to the system, which is at odds with pursuing fusion as a replacement for fission.

Yes, that's the point. Actually, until I read the OP's post, I never realized how critical this blanket was. It will be quite difficult to achieve self-sufficiency, because of course you cannot capture the neutrons in 4 pi without any structural capture and loss. So you need a neutron multiplier.

Non-fission neutron multiplication isn't easy (apart from spallation). If it were, people would use it to make thermal breeders with uranium.
Astronuc said:
In addition, Li has become more critical to rechargeable batteries, and the prospect of consuming Li as a fuel may not work because the demand for other uses will increase its value/cost.

Indeed.

Astronuc said:
Finally, to claim ITER is useless because we can't produce tritium in large quantities simply ignores the fact that d+d is the preferred reaction, but d+t is easier to accomplish. If d+t fusion can be successfully demonstrated, then d+d could work as well (maybe).

Yes, of course, but when you look at the difficulties people have to realize self-sustained, energetically useful let alone commercially competitive energy from D + T (that's the hope in the second half of this century) even granted tritium provision, D + D is for the 22nd century at best.

I'm not saying ITER is useless, but the tritium bottleneck is yet another difficulty as compared to the rosy pictures of "soon, clean energy here", no?

Fission seems the way to go for a long, long time still. Research is never useless, you always learn something. ITER's budget is about three times one big mistake by a trader in a front office. That's reasonable... :smile:
 
Last edited:
  • #35
vanesch said:
The neutrons produced that can be "wasted", say to make T, will each require the dissipation of at least 130 MeV of fission energy.
But this is not correct. One neutron causes a fission which produces 200 MeV. The additional neutrons would carry away only 2-4 MeV. They could be absorbed in special assemblies to produce tritium, which is produced in the coolant through the (n, alpha) reaction with Li anyway. There is no 130 MeV being carried away by the extra neutrons from fission. Anyway, a substantial fraction of the extra neutrons are absorbed in the fuel (U-238), which is eventually converted to fissile Pu-239 (and to Pu-240, Pu-241, Am-241, Cm-244, and other TRUs).

BTW - thorium (Th-232) with U-233 is the basis of a thermal breeder reactor.
 
  • #36
Astronuc said:
But this is not correct. One neutron causes a fission which produces 200 MeV. The additional neutrons would carry away only 2-4 MeV. They could be absorbed in special assemblies to produce tritium, which is produced in the coolant through the (n, alpha) reaction with Li anyway. There is no 130 MeV being carried away by the extra neutrons from fission.

Astro, I'm not saying (nor was the OP saying) that this energy is carried away by the neutrons, that is not the point.

The point is that you need to liberate 130 MeV in a fission reactor in order for you to have an "absorbable" neutron. This neutron is not carrying this energy, nor is this energy "lost", but it has been irreversibly converted into heat. Heat which can be used to make steam or whatever, to make electricity, to make hydrogen, to desalinate seawater, or to boil eggs - whatever you use a reactor for.

The point is not that there would be a kind of "loss of energy" or that it would require more energy to make the tritium than you get from using it. No. The point is that 130 MeV of useful fission energy has to be liberated in a fission reactor per available neutron, so (without significant neutron multiplication) per produced tritium atom. If you want to produce 1.5 × 10^26 neutrons over a year's time to do something with, like making tritium, there's no way around running a fission facility that has liberated a power of about 100 MW. That 100 MW thermal can be used to make electricity or boil eggs or whatever, so it can be put to good use, but it is power produced by a fission facility. If we take those 1.5 × 10^26 neutrons and let them turn Li-6 into tritium, then we have made 1.5 × 10^26 tritium atoms.

So consider that making 1.5 × 10^26 tritium atoms required you to run a nuclear facility that liberated a useful power of 100 MW.

Now, with this tritium fuel, if you go to a fusion facility running D + T, you can provide about 15 MW (if it doesn't have a blanket).

So the ridiculous part in this case is that you need to run a 100 MW reactor (producing, say, 35 MW of electricity), to fuel a fusion reactor of 15 MW, producing 5 MW of electricity.

So if you were planning to need 40 MW of electricity, you would have to build a fission reactor providing you with 35 MW of electricity and tritium fuel, and a fusion reactor of 5 MW. This is not a problem, but it is ridiculous to spend 100 years of research to achieve THAT.

Now, even with an 80% efficient blanket, you would still need a nuclear facility providing 7 MW of electrical power to have a fusion reactor making 5 MW of electrical power. That, too, is ridiculous as an achievement.
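
To see how that scales with the blanket, here is a minimal sketch (illustrative only: the factor ~6.7 and the ~33-35% conversion efficiencies are just the round numbers from this discussion, TBR is the tritium breeding ratio of the blanket, and decay, processing losses and inventory build-up are ignored):

Code:
def support_powers(tbr, p_fusion_th=15e6, eta_fission=0.35, eta_fusion=0.33):
    """Electric power of the fission plant needed to top up the tritium
    of a D-T reactor with breeding ratio tbr, using the ~100 MW(th) fission
    per 15 MW(th) fusion bookkeeping above (a factor of ~6.7 in thermal power)."""
    deficit = max(0.0, 1.0 - tbr)               # fraction of the tritium bought from fission
    p_fission_th = 6.7 * p_fusion_th * deficit  # thermal power of the supporting fission plant
    return p_fission_th * eta_fission, p_fusion_th * eta_fusion

for tbr in (0.0, 0.8, 1.0):
    p_fis_e, p_fus_e = support_powers(tbr)
    print(f"TBR {tbr:.1f}: {p_fis_e / 1e6:4.1f} MW(e) of fission needed "
          f"for {p_fus_e / 1e6:.1f} MW(e) of fusion output")

With no blanket you get the 35 MW(e) versus 5 MW(e) above; at TBR = 0.8 you still need about 7 MW(e) of fission; only at TBR = 1 does the external supply vanish.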

Astronuc said:
Anyway, a substantial fraction of the extra neutrons are absorbed in the fuel (U-238), which is eventually converted to fissile Pu-239 (and to Pu-240, Pu-241, Am-241, Cm-244, and other TRUs).

Indeed, which makes it WORSE, because for the same number of available neutrons to make tritium you need to liberate even MORE fission power, and you make more fission fuel... so your balance tips even further to the fission power side...

Astronuc said:
BTW - thorium (Th-232) with U-233 is the basis of a thermal breeder reactor.

I know, I should have said U-235.
 
  • #37
vanesch said:
The point is that you need to liberate 130 MeV in a fission reactor in order for you to have an "absorbable" neutron.
Again I have to object to this statement. It is the fission process - primarily the kinetic energy of the fission products - that is responsible for the energy. The number of neutrons released is irrelevant - except that having an excess of neutrons allows a sustainable fission process. As long as one neutron is released in the fission process and is subsequently absorbed to cause another fission, the process is sustainable without adding some other type of neutron source.

Now, the smaller the number of neutrons released in fission, the more difficult it becomes to design an economic and feasible system to produce energy.

In an LWR, there is an excess of neutrons and many are either absorbed by boric acid in the PWR coolant or in B-10 or Hf in control blades in a BWR. A by-product of n-absorption in boron is Li which undergoes an n-alpha reaction and produces tritium. Tritium is produced in normal operation whether it is subsequently used in fusion or not. However, I don't believe that T is produced in huge quantities.

In fusion, the goal is to use d+d, and if that is successful, then tritium is not necessary.
 
  • #38
Astro, we are talking completely past each other, and I honestly don't see why.
I see your point completely, but it is NOT what I am talking about. I'm also not "attacking" fusion. I'm saying that it is absolutely necessary for D+T fusion to have a self-sufficient blanket, or this form of promised energy is totally ridiculous, which is what I understood from the OP - EXCEPT that the OP also had some doubts about the realistic feasibility of making a genuinely self-sufficient blanket in a genuine power plant. And yes, in a much more remote future, we might consider D+D fusion, or even H+H fusion, but nobody is talking NOW about achieving technical break-even in D+D.
So this IS an important issue.

But let us come to our point of apparent disagreement: the fact that a fission plant needs to release about 130 MeV of fission energy per produced neutron that is "available" - under the hypothesis of no significant neutron multiplication outside of the fission process itself.

As I don't seem to be able to make you see the point I'm making, I will try to make YOU make the point. I'm certainly not trying to be condescending, I will just try to ask you some questions in order for you to see where I'm coming from, ok ?

Astronuc said:
Again I have to object to this statement. It is the fission process - primarily the kinetic energy of the fission products - that is responsible for the energy.

Yes, but you cannot AVOID having fission products when you want to have fission, can you? You cannot AVOID a fission reaction liberating about 200 MeV and heating its environment by that amount of energy, right?

Now, as THE ONLY SOURCE of neutrons is fission (you agree with that, up to the small fraction of delayed neutrons?), and on average a fission will liberate about 2.5 neutrons of which you will need 1 to sustain the chain reaction, HOW MUCH ENERGY do you think you will need to release in a fission plant to liberate 1.5 × 10^26 neutrons which you want to absorb in Li-6?

How many fissions are going to be necessary to liberate 1.5 × 10^26 neutrons and sustain a chain reaction?

Do you think this will be much less than 10^26 fission reactions? I don't see how you could POSSIBLY extract 1.5 × 10^26 neutrons out of a fission system without having AT LEAST 10^26 fission reactions happening, simply because each fission in itself has liberated 2.5 neutrons, so 10^26 fissions have liberated 2.5 × 10^26 neutrons, but in order to cause those 10^26 fissions, I have "eaten" 10^26 neutrons which are hence not available for anything else, so my balance of potentially "free" neutrons with 10^26 fissions is just 1.5 × 10^26 neutrons and no more. Do you disagree that you need at least 10^26 fission reactions in order to be able to provide 1.5 × 10^26 neutrons to special-purpose absorption reactions?

If yes, I would like to be enlightened about the detailed neutron balance.

Now, if we need to have 10^26 fission reactions, I do not see how we can avoid liberating 200 MeV × 10^26 in total fission energy. This is the heat that will be released in the process if we have 10^26 fissions. Do you think otherwise? Do you think you can have 10^26 fission reactions and yet not liberate 200 MeV × 10^26 as thermal energy in the reactor?

If this energy is released during one year of operation, do you agree we come close to 100 MW (thermal) ?

Now, with 1.5 × 10^26 neutrons available for the Li-6 + n -> T + alpha reaction, do you agree that we can produce at most 1.5 × 10^26 tritium atoms?

Now, with 1.5 × 10^26 tritium atoms, how many fusion reactions can we do in a D + T fusion reactor?

Given that each D + T reaction liberates about 17 MeV, how much energy is this in total? And if you release this over one year (the time the other reactor needed to "fill up the stock" of tritium), do you agree that the power will be around 13-15 MW (namely 17-20 MeV × 1.5 × 10^26 divided by the number of seconds in a year)?

Or not ?

So what is wrong with me saying that:

1) to have 1 neutron "free to be absorbed" from a reactor, you need to dissipate 130 MeV of fission energy, or more, in the reactor ?

2) the fusion power plant (without a blanket) will produce far less power burning the produced tritium than the reactor that produced the tritium in the first place will have produced?

Astronuc said:
The number of neutrons released is irrelevant - except that having an excess of neutrons allows a sustainable fission process. As long as one neutron is released in the fission process and is subsequently absorbed to cause another fission, the process is sustainable without adding some other type of neutron source.

Yes, I'm not disputing that. I'm saying that of the 2.5 neutrons produced in the fission process, you will use 1 (one) in the chain reaction, which is hence not available for something else, such as a neutron beam, or...

Astronuc said:
In an LWR, there is an excess of neutrons and many are either absorbed by boric acid in the PWR coolant or in B-10 or Hf in control blades in a BWR. A by-product of n-absorption in boron is Li which undergoes an n-alpha reaction and produces tritium. Tritium is produced in normal operation whether it is subsequently used in fusion or not. However, I don't believe that T is produced in huge quantities.

I agree with all that, but we were talking about how much NUCLEAR FISSION POWER we need to liberate in order to fuel a D + T fusion reactor that will itself consume a certain amount of tritium if there is no blanket, or if there is a lossy blanket.

Astronuc said:
In fusion, the goal is to use d+d, and if that is successful, then tritium is not necessary.

Of course. The other possibility is to have a fully self-sustained blanket in a D+T reactor. This will need neutron-multiplication to cover for losses of all kinds. Both are serious challenges.
 
Last edited:
  • #39
vanesch said:
Well, there is as of now already another serious problem that points to this tritium difficulty: there is a world-wide scarcity of He-3. He-3 is used in cryogenics, but also in neutron detection, and there's a world-wide shortage.
Well, He-3 is the decay product of tritium. If you have enough tritium, there isn't any He-3 supply problem. So the fact that there is now such a He-3 problem indicates that tritium is a very scarce resource.

The cost of manufacturing tritium is on the order of $100,000 per gram. Despite the shortage, commercial He-3 still costs $15,000 per gram. In essence, tritium alone costs much more than the electric energy that we could extract from it via d-t process if we had a working fusion reactor.
 
  • #40
I understand vanesch's point now. Basically, he is saying that for whatever neutron flux you are using to produce tritium in a fission reactor, that neutron flux would be producing on the order of 10 times as much power from fissions compared to what you would eventually get fusing the tritium, due to the fact that one neutron releases 200 MeV from fission but only ~20 MeV from fusion.

However, the reason this is inaccurate is that when Li-7 is used to breed tritium, it returns a neutron of lower energy. So one fission neutron could end up producing multiple tritium atoms (the number of which I am not sure of; I'd have to model it and see).
 
  • #41
hamster143 said:
The cost of manufacturing tritium is on the order of $100,000 per gram. Despite the shortage, commercial He-3 still costs $15,000 per gram. In essence, tritium alone costs much more than the electric energy that we could extract from it via d-t process if we had a working fusion reactor.

The reason the cost is so high is that production is very low. Production is low because there is only limited, specialized demand. The price would go way down if it were mass-produced for commercial purposes.
 
  • #42
So if Tritium is currently in short supply, what would a viable alternative be for a fusion fuel?

I think one advantage of a D-T mixture is the energy gain and the lower activation energy (ignition temperature) compared to something like a D-D reaction. For a D-D reaction, when the efficiencies of a power generation cycle are taken into account, the total energy gain nears break-even and wouldn't be as useful from a large-scale energy production standpoint.
 
  • #43
I think Eric Drexler summed up the issue in his blog post.

Why fusion won’t provide power (at a reasonable cost)
http://metamodern.com/2010/01/20/why-fusion-won%E2%80%99t-provide-power/
 
Last edited by a moderator:
  • #44
joelupchurch said:
I think Eric Drexler summed up the issue in his blog post.

Why fusion won’t provide power (at a reasonable cost)
http://metamodern.com/2010/01/20/why-fusion-won%E2%80%99t-provide-power/

Complaining about the capital costs for commercial fusion power plants is a bit premature at this point, don't you think?

Try reading the report he cites. I couldn't find any hard numbers, just a bunch of vague fluff about technology readiness levels. Then again I couldn't manage to get more than a few pages into it, I'm not good at reading political-speak mumbo jumbo.
 
Last edited by a moderator:
  • #45
hamster143 said:
The cost of manufacturing tritium is on the order of $100,000 per gram.
The relevant cost of T would be after Beryllium blanketed fusion reactors exist, not before.

hamster143 said:
Despite the shortage, commercial He-3 still costs $15,000 per gram.
OK, but there's no viable He-3 fusion reactor on the table even if you had the He-3. The He-3 + He-3 cross section is, what, ~50X smaller than D-T's, it requires an impossibly hotter temperature (in a thermalized confinement reactor), and it produces about 1/3 less energy per go.
hamster143 said:
In essence, tritium alone costs much more than the electric energy that we could extract from it via d-t process if we had a working fusion reactor.
Let's see. A gram of T undergoing 100% fusion with D, at 17 MeV per fusion, would release ~34 × 10^23 MeV, or about 544 gigajoules, of which roughly a third could be converted to electricity, resulting in ~50 MWh. At the current retail price of $100/MWh that's about $5000 of sellable energy. So indeed, if D-T fusion is to work commercially, I suppose the cost of T will have to come down. :wink:
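
For anyone who wants to redo that estimate, a quick sketch (Avogadro's number plus the 17 MeV per fusion, ~1/3 conversion efficiency and $100/MWh retail figures used above):

Code:
N_A = 6.022e23                              # atoms per mole
MeV_J = 1.602e-13                           # joules per MeV

atoms_per_gram = N_A / 3.016                # tritium molar mass ~3.016 g/mol
E_thermal = atoms_per_gram * 17.0 * MeV_J   # J released if every T nucleus fuses with D
E_electric = E_thermal / 3.0                # ~1/3 conversion to electricity
MWh = E_electric / 3.6e9                    # 3.6e9 joules per MWh

print(f"thermal energy:  {E_thermal / 1e9:.0f} GJ")    # ~540 GJ
print(f"electric output: {MWh:.0f} MWh, worth ~${MWh * 100:,.0f} at $100/MWh")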
 
  • #46
QuantumPion said:
Complaining about the capital costs for commercial fusion power plants is a bit premature at this point, don't you think?
...jumping in: No, it's not premature. Otherwise we could have a go at building spacecraft powered by anti-matter, which is produced all the time in accelerators but not (nearly) for a reasonable capital cost. ITER is not purely a research facility. In addition to research, it's there to prove the concepts required for commercial fusion power.
 
  • #47
QuantumPion said:
I understand vanesch's point now. Basically, he is saying that for whatever neutron flux you are using to produce tritium in a fission reactor, that neutron flux would be producing on the order of 10 times as much power from fissions compared to what you would eventually get fusing the tritium, due to the fact that one neutron releases 200 MeV from fission but only ~20 MeV from fusion.

Yes, this is what I also understood from the OP.

However, the reason this is inaccurate is that when Li-7 is used to breed tritium, it returns a neutron of lower energy. So one fission neutron could end up producing multiple tritium atoms (the number of which I am not sure of; I'd have to model it and see).

I'm no expert, but it is going to be tricky. People use Be and Pb as "neutron multipliers", and I myself also gave the example of Li-7. But it is tricky: all structural material will make you lose some neutrons, and so will incomplete angular coverage. And this is what I realized (I didn't think of it before reading the OP): it is ABSOLUTELY ESSENTIAL to show that this problem can be solved, or (DT) fusion is ridiculous as it stands.
Everybody is concentrating on the Q factor and so on, and how large Q should be in order for a commercial plant to be viable, but the blanket is just as important. Without a self-sufficient blanket, it is a no-go.
 
  • #48
hamster143 said:
Despite the shortage, commercial He-3 still costs $15,000 per gram.

If we could buy He-3 for that price, we would be happy ! We got quotes of more than 2000 Euro for one litre (1 atmosphere).

EDIT: silly me, that's comparable :redface:
 
Last edited:
  • #49
Sufficient T generation is only one of the absolutely critical problems that need to be solved for successful commercial fusion. DT fusion also requires a first wall that can withstand the fast neutron flux (every single atom in the first wall will be displaced 30 times per year [1]) for as long as an economic reactor requires. The design of a fission reactor can play games with fuel rod count and diameter to manage heat flux. No such design fix is possible with the first wall of a Tokamak.

[1] http://www.askmar.com/Robert%20Bussard/The%20Trouble%20With%20Fusion.pdf
 
Last edited by a moderator:
  • #50
vanesch said:
...natural abundance of tritium... is of the order of 10^-17..., inexploitable for commercial energy production.

As I understand the OP, his point is the following: given that tritium has to come from an artificial source, usually neutron bombardment of Lithium, there is a bookkeeping problem:

a) one fusion reaction consumes one tritium nucleus, and produces one neutron.

b) if we have only a tritium production of one tritium atom per neutron, this is never going to achieve auto-refueling.

c) if a serious fraction of the to be burned tritium has to come from another source, nuclear fission, then we have the following problem:

c1) [Fusion produces less electricity than the fission reactors that supply tritium to it, hence not interesting]

[Please refer to the post by vanesch; I have shortened it in the present quote]

...the blanket needs to achieve self-sufficiency if this power source is not to be ridiculous, because it would need a bigger fission power station next to it than what it can deliver itself.
Yes, this is what I tried to explain. Many thanks to vanesch for having made it clearer than I did.

It means that, to substantially replace fission, fusion has to regenerate its tritium by itself. This is, to my eyes, a harder constraint than technology issues (which tend to get solutions even when they first look like impossibilities): it is an issue detectable in plain neutron bookkeeping - the kind of thing that should work with a big margin before you even include the meanness of real technology, but here the margin is negative.

Other reactions than D-T would solve that but are out of reach by tokamaks, even on the timescale of fusion research.

Though this issue of tritium regeneration hasn't been widely publicized (...sorry folks...) it is known by tokamak researchers. I had raised this objection, as did specialists with more knowledge and influence. This is why ITER now includes a programme to develop and test blankets with neutron multipliers, the only way to regenerate tritium.

Many papers deal with this presently; unless I missed something,
- Beryllium is abandoned as it couldn't achieve regeneration in simulations;
- Lithium-7 is abandoned, blankets would even be enriched in Lithium-6;
- Lead seems to be the only hopeful multiplier, through (n, 2n) reactions exploiting the 14 MeV neutron from D-T fusion.

Nearly all papers consider a lead-lithium eutectic for tritium regeneration. This eutectic would serve as heat removal fluid as well, because both need to cover the chamber completely.

A simple description of the experimental blankets for ITER:
http://www.nuklearforum.ch/_upl/files/Pr__sentation_Poitevin.pdf

Under simulation conditions, the regeneration factor could reach up to 1.15 - the kind of figure I hate to see when I must guarantee >1 but other design constraints haven't been included yet... And a tokamak brings some additional design constraints, doesn't it?
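
To see why such a thin margin is scary, here is a toy tritium-inventory model (illustrative numbers only, not from any blanket study: a ~0.5 GW D-T burn rate, a 1000 g start-up stock, tritium's 12.3-year half-life; processing losses, duty cycle and the start-up stock of the next plant are all ignored):

Code:
import math

YEAR = 3.156e7                        # seconds per year
lam = math.log(2) / (12.32 * YEAR)    # tritium decay constant, 1/s

burn_rate = 2.0e20                    # T nuclei burned per second (~0.56 GW of D-T fusion)
atoms_per_gram = 2.0e23               # roughly, for tritium

def years_until_empty(tbr, start_grams=1000.0, dt=3600.0, horizon_years=20):
    """Toy inventory model: breeding minus burn-up minus radioactive decay."""
    n = start_grams * atoms_per_gram
    t = 0.0
    while t < horizon_years * YEAR:
        n += (tbr - 1.0) * burn_rate * dt - lam * n * dt
        t += dt
        if n <= 0.0:
            return t / YEAR
    return None                       # inventory survived the whole horizon

for tbr in (0.95, 1.00, 1.05):
    result = years_until_empty(tbr)
    print(f"TBR {tbr:.2f}: " + (f"start-up stock exhausted after {result:.2f} years"
                                if result else "inventory still positive after 20 years"))

Anything below TBR = 1 eats the start-up stock in well under a year, and even at exactly 1 the inventory slowly decays away, so the real breeding margin has to cover decay, losses and the start-up inventory of every new plant.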
 
Last edited by a moderator:
  • #51
The NIF LIFE concept proposes to create its own tritium using a lithium-enriched liquid salt blanket:

LLNL.gov said:
Q: Tritium is rare and very expensive to produce. How would a fusion power plant get the tritium it needs to sustain continuous fusion reactions?

A: It's true that tritium exists only in small quantities in nature, so a fusion energy power plant would need to create its own tritium fuel. The neutrons generated in the fusion reaction will be absorbed within a liquid salt blanket surrounding the fusion chamber to create a hot fluid that will turn a turbine to generate electricity. The salt will contain lithium, which will react with the fusion neutrons to produce helium and tritium. Due to neutron multiplication reactions, it is possible to make more than one triton (tritium nucleus) for each one consumed in fusion reactions, creating a net positive generation of tritium. This tritium is then sent to the target factory to be used to produce new targets.
https://lasers.llnl.gov/education/faqs.php#tritium
 
Last edited by a moderator:
  • #52
So Enthalpy, you don't agree that funding ITER, which could possibly lead to fusion using not only D-T but also D-D or other fuels, is worth it?
 
  • #53
Drakkith said:
So Enthalpy, you don't agree that funding ITER, which could possibly lead to fusion using not only D-T but also D-D or other fuels, is worth it?
He is right when he says:
Other reactions than D-T would solve that but are out of reach by tokamaks, even on the timescale of fusion research.
 
  • #54
vanesch said:
Yes, that's the point. Actually, until I read the OP's post, I never realized how critical this blanket was. It will be quite difficult to achieve self-sufficiency, because of course you cannot capture the neutrons in 4 pi without any structural capture and loss. So you need a neutron multiplier.
The D-T reaction needs a neutron multiplying (tritium breeding) coefficient of 1.15-1.25, or tritium has to be produced in existing fission reactors. There is no other way.
vanesch said:
Non-fission neutron multiplication isn't easy (apart from spallation). If it were, people would use it to make thermal breeders with uranium
That's not so, if you consider that realizing fusion means less dangerous waste.
vanesch said:
Yes, of course, but when you look at the difficulties people have to realize self-sustained, energetically useful let alone commercially competitive energy from D + T (that's the hope in the second half of this century) even granted tritium provision, D + D is for the 22nd century at best.
I doubt the commercial feasibility of the D-D reaction. More interesting would be to build D-T reactors with a tritium breeding coefficient bigger than 1.15-1.25, then wait until some of the tritium decays to He-3, and then build aneutronic D-He3 reactors.
vanesch said:
I'm not saying ITER is useless, but the tritium bottleneck is yet another difficulty as compared to the rosey pictures of "soon, clean energy here", no ?
ITER is useful for accumulating technological know-how - magnets, vacuum, blanket, first wall, neutral beam injection, etc. Someone has to do these jobs.
But I doubt that the tokamak as such will ever be able to generate net power. See this link: http://iter.rma.ac.be/Stufftodownload/Texts/BurnCriteria.pdf where, once the efficiency of real energy conversion cycles is taken into account, the required confinement time is calculated to be on the order of 560 s (page 8, after formula (57)).
But there is no bottleneck with tritium. And I disagree with the statement made here that lithium is a rare element.
 
Last edited:
  • #55
Joseph Chikva said:
More interesting would be to build D-T reactors with a tritium breeding coefficient bigger than 1.15-1.25, then wait until some of the tritium decays to He-3, and then build aneutronic D-He3 reactors.

While Lithium is certainly abundant enough, reaching the mandatory Tritium breeding coefficient looks impossible, given that present computations don't integrate many difficult constraints.

Any designer likes to start with a margin of 10 from scratch, if he's to keep >1 as his design advances. For the uranium chain reaction, they had 2.4 neutrons to keep 1, and this needed a big effort to develop materials, forms... Starting from 1.15 is disheartening - to my eyes it's impossible.

3He-D is even more difficult than D-D because of the third repelling proton.

Then, you have the radioactive pollution by the regeneration blankets, which promises to be as bad as uranium fission.

Developing tokamaks for the sake of pure science may be fun, but not if we need energy right now, not if we see a probable impossibility, not if it ties up for decades thousands of brilliant people who could instead solve more productive challenges, like electricity storage.
 
  • #56
Enthalpy said:
While Lithium is certainly abundant enough, reaching the mandatory Tritium breeding coefficient looks impossible, given that present computations don't integrate many difficult constraints.

Any designer likes to start with a margin of 10 from scratch, if he's to keep >1 as his design advances. For the uranium chain reaction, they had 2.4 neutrons to keep 1, and this needed a big effort to develop materials, forms... Starting from 1.15 is disheartening - to my eyes it's impossible.

3He-D is even more difficult than D-D because of the third repelling proton.

Then, you have the radioactive pollution by the regeneration blankets, which promises to be as bad as uranium fission.

Developing tokamaks for the sake of pure science may be fun, but not if we need energy right now, not if we see a probable impossibility, not if it ties up for decades thousands of brilliant people who could instead solve more productive challenges, like electricity storage.
To my eyes it's impossible to produce net power using tokamaks and the D-T reaction.
But I am not talking about the viability of any particular fusion method; here I am only talking about fuel cycles. And tritium breeding with whatever breeding coefficient is needed is a less complicated challenge than achieving breakeven.
I have never heard of blankets in which several years' loading of breeding materials (Li-6 plus a neutron multiplier) would be placed - only the current working quantities. So that will not be as dangerous as fission in case of an accident.
Enthalpy said:
not if it ties up for decades thousands of brilliant people who could instead solve more productive challenges, like electricity storage.
You are wrong. Demand for electricity grows. And only growth of generation would solve that problem. Or we would not need any electricity storage.
 
Last edited:
  • #57
Joseph Chikva said:
You are wrong. Demand for electricity grows. ...
Not lately in the US:

http://www.eia.gov/totalenergy/data/annual/txt/ptb0802a.html
US electricity net generation (billion kWh):
2005: 4,055
2006: 4,064
2007: 4,156
2008: 4,119
2009: 3,953
2010: 4,120

Similarly, US energy intensity, that is energy per $ of economic production, has been declining and continues to decline.
http://www.eia.gov/emeu/25opec/sld022.htm
 
Last edited by a moderator:
  • #58
mheslep said:
Not lately in the US:

http://www.eia.gov/totalenergy/data/annual/txt/ptb0802a.html
US electricity net generation (billion kWh):
2005: 4,055
2006: 4,064
2007: 4,156
2008: 4,119
2009: 3,953
2010: 4,120

Similarly, US energy intensity, that is energy per $ of economic production, has been declining and continues to decline.
http://www.eia.gov/emeu/25opec/sld022.htm
World production? China, India, Brazil, some developing countries? Total energy generation?
That is not engineering but more an economic and political issue.
Cheap electricity in the US would increase the competitiveness of the US economy, and generation demand would grow as well - certainly if the USA is not going to concede its economic leadership. The gap between China and the USA is getting smaller and smaller.
And renewables are not cheap.
 
Last edited by a moderator:
  • #59
mheslep said:
Not lately in the US:

Umm, I don't think that looking 5 years into the past is sufficient to say that future energy demand won't/isn't increasing. A quick look at that table shows that up until 2008 there was a continual increase. And if your view of "lately" is only 3 years ago, then I think you should expand your view of the situation.
 
  • #60
Any chance you can confine the fusion bashing to ONE thread, Enthalpy?
 
