
Current State of Nuclear Fusion Power

  1. Nov 28, 2018 #1

    hotvette

    Homework Helper

    When I was an undergrad doing research in a university lab, the director of the lab also consulted at a nuclear fusion company. I remember, like it was yesterday, him stating that commercial nuclear fusion was going to be a reality within 10 years. That was 1977.

    What happened? Lack of government funding? Technical problems that were far more difficult to overcome than people thought? Both? I have to admit I haven't kept up on the state of the technology, so I'm hoping someone will be willing to comment.
     
  3. Nov 28, 2018 #2

    Drakkith

    Staff Emeritus
    Science Advisor

    Both, as far as I understand. It turns out that achieving a net power gain with fusion is extremely difficult. Radiation losses, lost particles, plasma turbulence, and many other problems have to be solved or mitigated before fusion can become a power source, and none of them are simple or easy to solve.

    I can't say much on the funding side of things, but it was my impression that we would be much further along if we had seriously poured funds into research and development of fusion. Of course, that's easy to say, but it's hard to sell that to the taxpayers and politicians.
     
  4. Nov 28, 2018 #3

    phyzguy

    Science Advisor

    It's worse than that. In 1951 Lyman Spitzer testified to the US Senate and recommended funding fusion research as a means to clean energy. You can always argue that there should be more funding, but many, many billions of dollars have been spent. It has just been much more difficult than people thought. The joke in the field is that "fusion is the energy of the future ... and always will be".
     
  5. Nov 28, 2018 #4

    PeterDonis

    Staff: Mentor

    I was told the same thing (commercial fusion within 10 years) in the mid-1980s. :wink: At that time I was at MIT doing my bachelor's and master's thesis research at the Plasma Fusion Center there. The Alcator C-Mod tokamak was on the drawing board and everybody thought it was going to be a huge step towards ignition.

    I'm not sure it's been a lot more difficult than the plasma physicists thought. I think they just overestimated their ability to continue to get increased funding against competing priorities. I don't have a link to the data handy right now, but IIRC in the mid to late 1980s they were asking for something like $10 billion a year, and were actually getting less than 10 percent of that. So I think it's not so much that they're encountering issues they didn't expect, as that they're encountering the sorts of issues they expected, but at a pace 10 times or more slower than they had hoped.

    Fusion is not the only area of science to have this funding issue. Particle physicists similarly failed to get funding for the Superconducting Super Collider in the early 1990s. If that had been fully funded, we might have had data similar to what we're getting now from the LHC, 10 to 15 years sooner.
     
  6. Nov 28, 2018 #5

    phyzguy

    Science Advisor

    But clearly funding is not the whole story. Look at the National Ignition Facility. It was funded, and has cost at least $3.5 billion. People were pretty confident that it would achieve ignition, or they wouldn't have given it that name. And yet it hasn't resulted in ignition, at least not yet. And from what I've heard from people working there, they no longer expect to achieve ignition. The plasma just keeps finding imaginative ways to escape.
     
  7. Nov 28, 2018 #6

    PeterDonis

    Staff: Mentor

    Some people were pretty confident, yes. But, at least among the people I worked with at MIT in the mid-1980s, a lot were not at all confident. Some of that was probably the normal rivalry between different projects competing for the same funding stream--the MIT people were all working on tokamak fusion. But some of it, at least from what I remember, was based on actual analysis of the proposed configuration and what it could reasonably be expected to accomplish.

    Also, the $3.5 billion you mention was spent over a period of about 10 years. Granted, at least part of the delay (and the increase in budget--it was originally supposed to cost not much over $1 billion) was due to mismanagement (which is unfortunately common with government-funded projects). But even without mismanagement, a government project of this sort funded over multiple years means that the scientists in charge of the project have to request funding every year; the government is not going to commit to all of the funding up front. This fact all by itself means the people running the project are operating with an uncertainty that makes it much harder to make progress.
     
  8. Nov 29, 2018 #7

    russ_watters


    Staff: Mentor

    I think if they were making projections based on the prediction that their funding would increase by a factor of 10, that's a pretty wild fantasy. Heck, even at that, a single project takes at least 10 years to complete no matter how much money you throw at it, so unless everyone thought the next project would be the commercially viable reactor prototype, the prediction doesn't make sense. And 40 years later, after spending something like $30B, we are further away from success than ever!

    So these predictions didn't match their funding levels or the actual projects they were working on. So where did these predictions come from? Were they totally flippant? It really doesn't feel to me like these predictions had any basis in reality and I have very little confidence that commercial fusion will be achieved in my lifetime (expected 50 more years).

    Personally, though, I think it's the science. I was only a kid in the 80s, but I remember talk of fusion being right around the corner. It wasn't because scientists thought their funding would explode, it was because they thought they had the science figured out and the next reactor would be a commercial prototype, because the one they were working on "now" was going to break even, and after that it would be time for commercialization. But it didn't and still hasn't happened. They were wrong/overconfident about the state of the science.

    I've totally lost interest in considering fusion in future energy projections. If it happens, great, but I no longer consider it likely or even necessary for our future. At least today's more realistic timelines make it easier to discard fusion from those plans. ITER is planned to be operational in about 20 years. Wikipedia shows DEMO being designed and constructed from the 2020s to 2033, but that probably doesn't account for delays in ITER. A more realistic timeline would put the program in the 2035-2050 range. If it succeeds, the first commercial reactor could be operating by 2065. If all goes well - which it never has. I'll be 90 if I'm still alive. So there's no need to consider fusion as part of my lifetime's energy future.
     
    Last edited: Nov 29, 2018
  9. Nov 29, 2018 #8

    PeterDonis

    Staff: Mentor

    I think they believed that they could convince the bureaucrats in Washington that fusion power was a big enough potential win to justify the level of funding they were requesting. IIRC some advocates were making analogies with the Manhattan Project and Project Apollo. But they failed to convince the bureaucrats.

    I suspect that part of the reason for the slow progress, once it became clear that funding at the levels that were requested was not going to happen, was that a lot of talented people left the field and went to work on something else. So the progress that's being made now is not being made by the same people or the same level of talent that would have been working on it if the funding had materialized.
     
  10. Nov 29, 2018 #9

    russ_watters


    Staff: Mentor

    But haven't we spent more money than on the Manhattan project, producing dozens of reactors that failed to achieve their goals? (Breakeven)

    [sorry, but I was running-editing the previous post, with related content]
     
  11. Nov 29, 2018 #10

    PeterDonis

    Staff: Mentor

    Say your figure of approximately $30 billion spent on fusion to date (in current dollars) is correct.

    According to Wikipedia [1], the total cost of the Manhattan Project was $20.5 billion in 2017 dollars, and the cost of Project Apollo [2] was around $120 billion in 2016 dollars.

    [1] https://en.wikipedia.org/wiki/Manhattan_Project#Cost
    [2] https://en.wikipedia.org/wiki/Apollo_program#Costs

    So the total amount spent on fusion to date is more than the Manhattan Project, but significantly less than Project Apollo. But there's a big difference: the Manhattan Project spent all of that money in 5 years; Project Apollo spent it all in about 13-14 years (if you count the whole sequence from Mercury to Gemini to Apollo, from 1959 to 1973). Fusion research has taken 6 decades and counting (starting from the 1950s) to spend its money. Concentration of effort and focus matters. So does the certainty of funding: nobody on the Manhattan Project had to worry about whether Congress would continue the program. Nor did anyone on Apollo up through about 1970. Fusion has never had the same concentration of effort or certainty of funding.
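
    To put the concentration-of-effort point in rough numbers, here's a quick back-of-envelope sketch in Python using only the figures quoted above (the fusion total is the ~$30 billion estimate from earlier in the thread, so treat the per-year rates as order-of-magnitude only):

        # Rough annual spending rates, using the inflation-adjusted totals above.
        programs = {
            "Manhattan Project": (20.5e9, 5),   # ~$20.5B over ~5 years
            "Project Apollo":    (120e9, 14),   # ~$120B over ~14 years
            "Fusion research":   (30e9, 60),    # ~$30B over ~6 decades (estimate)
        }

        for name, (total_dollars, years) in programs.items():
            print(f"{name}: ~${total_dollars / years / 1e9:.1f}B per year")

        # Roughly $4.1B/yr (Manhattan), $8.6B/yr (Apollo), $0.5B/yr (fusion).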
     
  12. Nov 29, 2018 #11
    maybe Zeno was onto something after all :wink:
     
  13. Nov 29, 2018 #12

    russ_watters


    Staff: Mentor

    Yes, it's true that 1,000 centrifuges can enrich as much uranium in 1 year as 100 can in 10 years. But that only does you good if, when you press the red button, the bomb goes off. What's remarkable about the Manhattan Project, to me, is that despite the short timeline and nascent science, the program produced two very different, successful designs - one of which they didn't even bother to test full-scale before employing it - and no significant failures.

    We're long past the point where, if fusion were as easy as fission, it would have happened already. If anything, longer timelines should increase the likelihood of success, not decrease it. A small group of The Greatest Scientists in History has only a finite amount of brain power and waking hours, and a 10x funding rate can't produce 10x the theoretical development. And working under threat of Armageddon makes people work faster, but not necessarily more accurately. More time and more great scientists should make fusion projects more likely to succeed. To me, this discrepancy in success is all about fusion just plain being a lot harder to do than fission.

    For a program that is all failures, what a faster funding rate gives you is more frequent failures. This is fine if the failures are leading you to an eventual success, but the problem is that we don't know if it is going to take 100, 1,000 or 1,000,000 failures before the program succeeds -- or even if it ever will. It's really hard to convince people to spend money without a good idea of the odds of success.
     
  14. Nov 29, 2018 #13

    russ_watters


    Staff: Mentor

    Let me put a finer point on this:
    In 1977, the hot project was the Tokamak Fusion Test Reactor:
    https://en.wikipedia.org/wiki/Tokamak_Fusion_Test_Reactor

    It's probably also the project I was hearing about when I learned that fusion was just around the corner when I was in elementary school in the '80s... along with the complementary article of faith that we'd be out of oil in 20 years.

    Anyway, it was conceived in 1974, funded in 1975, and construction started in 1980. It was up and running by the mid-'80s and was fairly confidently expected, from conception through the late '80s, to achieve break-even. That's your 10-year timeframe as of 1977. What happened was it simply failed. It had nothing to do with funding: it just didn't work.

    Since then, subsequent projects have also failed.

    Looking back, we can speculate that a faster rate of failed projects could have paved the way to success by now, but we just don't know how many failures it will take, so we can't know how soon success could happen, even with vastly higher funding. If anything, scientists are less confident today that the next projects will succeed than they were in 1977.
     
    Last edited: Nov 29, 2018
  15. Nov 29, 2018 #14

    phyzguy

    Science Advisor

    For what it's worth, I agree with russ_watters and with the above statement in the OP. More funding would have helped, but it has proven to be a much more difficult problem than originally anticipated. As an example, see this statement from this Wikipedia page:

    "When the topic of controlled fusion was first being studied, it was believed that the plasmas would follow the classical diffusion rate, and this suggested that useful confinement times would be relatively easy to achieve.

    However, in 1949 a team studying plasma arcs as a method of isotope separation found that the diffusion time was much greater than what was predicted by the classical method. David Bohm suggested it scaled with 1/B. If this is true, Bohm diffusion would mean that useful confinement times would require impossibly large fields."
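
    To see why that 1/B scaling was such bad news, here's an illustrative numerical sketch (my numbers, not part of the quote). The Bohm coefficient is usually written D_B = k_B*T / (16*e*B), which falls off only as 1/B, whereas classical diffusion falls off as 1/B^2, so raising the field buys far less confinement than classically expected:

        # Illustrative Bohm diffusion coefficient at a fusion-relevant temperature.
        e = 1.602e-19      # elementary charge [C]
        k_B = 1.381e-23    # Boltzmann constant [J/K]
        T = 1.0e8          # illustrative plasma temperature [K], of order 10 keV

        for B in (1.0, 2.0, 5.0, 10.0):  # magnetic field strength [T]
            D_bohm = k_B * T / (16 * e * B)
            print(f"B = {B:4.1f} T  ->  D_Bohm ~ {D_bohm:6.1f} m^2/s")

        # Doubling B only halves D_Bohm (so the confinement time ~ a^2/D merely
        # doubles), whereas classical diffusion would drop by a factor of 4.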
     
  16. Nov 29, 2018 #15

    PeterDonis

    Staff: Mentor

    I think you mean "found that the diffusion rate was much greater than what was predicted by the classical method", yes?
     
  17. Nov 29, 2018 #16

    phyzguy

    Science Advisor

    It was a quote I copied from Wikipedia, but yes, you're right. However, the point still stands. Fusion plasmas continue to find unanticipated ways to escape our confinement schemes.
     
  18. Nov 29, 2018 #17

    etudiant

    Gold Member

    This discussion seems a little pessimistic to me.
    The requirement is to make the plasma sufficiently dense and sufficiently hot for a sufficient amount of time to allow sufficient numbers of fusion reactions to take place on a sustainable basis. IIRC, the overall measure 'density x temperature x confinement time' has increased pretty steadily since the 1950s, by about a factor of 10 per decade. While the best results to date are still at least two orders of magnitude short of what is needed, the goal is not nebulous or out of reach.
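
    As a rough sanity check on that trend, here's a back-of-envelope extrapolation, assuming the factor-of-10-per-decade rate and the two-orders-of-magnitude gap quoted above (these are assumptions, not measured data):

        import math

        improvement_per_decade = 10.0   # assumed historical rate of triple-product growth
        gap_orders_of_magnitude = 2.0   # best results ~100x short of what is needed

        decades_needed = gap_orders_of_magnitude / math.log10(improvement_per_decade)
        print(f"~{decades_needed:.0f} more decades at the historical rate")
        # -> ~2 more decades, *if* the historical trend continues to hold.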

    What is less clear is whether this plasma bottle concept is economical. Of course, given the eagerness with which entire countries such as Germany are embracing absurdly uneconomic power concepts such as wind and solar in a region where steady winds are rare and sunshine is limited, such considerations may be irrelevant.
     
  19. Nov 29, 2018 #18

    phyzguy

    Science Advisor

    No one is arguing that progress isn't being made, it certainly is. The point I and others have been making is that it has proven more difficult than many people originally thought. That doesn't mean we should quit working on it. In fact, to me one of the more exciting avenues is the use of high-temperature superconductors. The power density from a fusion plasma scales as the fourth power of the magnetic field. So if we can double the magnetic field, we can reduce the size of the reactor by a factor of 16. It looks like with high temperature superconductors, large increases in the magnetic field (more than a factor of 2) can be made. This has a big impact on the economics of fusion power.
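
    For concreteness, here's a minimal sketch of that scaling argument, assuming fixed plasma beta and fixed total fusion power (the field multipliers are just illustrative):

        # Power density ~ B^4 at fixed beta, so at fixed total fusion power the
        # required plasma volume scales as 1/B^4.
        def volume_ratio(field_multiplier: float) -> float:
            """Relative reactor volume needed when the field is scaled by this factor."""
            return 1.0 / field_multiplier ** 4

        for m in (1.0, 1.5, 2.0):
            print(f"B x {m:.1f}  ->  volume x {volume_ratio(m):.4f}")

        # Doubling B (m = 2.0) gives volume x 0.0625, i.e. the factor-of-16
        # reduction mentioned above.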
     
  20. Nov 30, 2018 #19
    But isn't it the case that simply producing some net fusion output will not be the real "milestone"? Given the cost and complexity of a typically proposed reactor like a tokamak of that size and field strength, the net power output and reliability would have to be that much greater for any investor to even consider such a plant being able to return the investment.

    Along russ_watters' line of thought, is it possible that fusion, even after breakeven, will be capable of being economical in our lifetimes?
    IIRC, unlike with fission, even after you build a breakeven (1:1) fusion reactor, pushing it further into net-gain territory still requires increasing the B-field strength for confinement, increasing the confinement time, and improving other factors just to gain the next couple of percent of energy return over the input?
     
  21. Nov 30, 2018 #20

    russ_watters


    Staff: Mentor

    Could you please provide a reference for that parameter so I can read more about the history, current state, and projections? [edit: oh, that's the "triple product"]

    How do you know that 2 more orders of magnitude will get the job done? Three decades ago, didn't they think the requirement was 5 orders of magnitude lower?

    It's not just the failures that are bothersome, it's that they don't know why they are failing and, worse, sometimes confidently predict success only to turn out to be at least several orders of magnitude wrong.

    [edit2] What the string of failures has also done is reduce the commercial viability of fusion energy: each successive reactor has to be bigger, and thus more expensive, than the previous one. We already have trouble with the scale of $10B fission plants. What if fusion plants have to cost $20B and produce an eighth as much energy? (ITER, if it works and doesn't go further over budget.)
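
    Spelling out the arithmetic behind that "what if", using the purely hypothetical numbers in the question ($10B fission plant as the baseline, $20B fusion plant producing an eighth as much energy):

        # Hypothetical cost per unit of energy, from the "what if" above.
        fission_cost, fission_output = 10e9, 1.0       # $10B, normalized output
        fusion_cost, fusion_output = 20e9, 1.0 / 8.0   # $20B, an eighth the output

        ratio = (fusion_cost / fusion_output) / (fission_cost / fission_output)
        print(f"Fusion would cost ~{ratio:.0f}x as much per unit of energy")  # ~16x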
     
    Last edited: Nov 30, 2018