
Will graphene replace silicon?

  1. Jan 2, 2012 #1
    Well, I haven't read much on this subject of graphene.

    But I have read there are obstacles to overcome... much like quantum computers, which probably have a million times more complicated issues to overcome. I'm skeptical that a quantum computer will exist in the home for 100+ years...

    But will graphene replace silicon in our computers any time soon? Can anyone with knowledge of the technology see a graphene age on the horizon?

    What are the big obstacles and how soon can you see it reaching our homes?

    I can just imagine the benefits a project like Folding@home would have with 100 GHz CPUs!
  3. Jan 4, 2012 #2
    Graphene is where silicon was in 1948 (the first transistor used germanium, BTW): basically nice in theory and very promising, but with many practical issues that prevented even lab prototype devices from being made. It wasn't even possible to make a reliable silicon transistor until the mid-to-late 1950s.

    Will graphene be practical in only 10 years? It's a long shot because of current infrastructure and technologies: if it can't be made with existing technologies or interface to existing technologies (namely silicon ICs), it will be stillborn. To get from point A to point Z you need to traverse B through Y, and those intermediate steps depend on what already exists.

    An excellent historical parallel exists with GaAs, which has been predicted to replace silicon "real soon now" since the late 1960s! It still hasn't happened. In fact, all III-V semiconductors (of which GaAs is about 50%) still account for less than 1% of the silicon market in both volume and revenue.

    The other "real world" issue is the lack of capital to invest in new technologies (even nice, cool ones like graphene) because of the US financial crisis of 2007 and the current Euro crisis: R&D and start-up funding have been as rare as hen's teeth. The best funding source for any leading-edge technology over the last 10 years has been China (specifically the Chinese government).

    Oh, and without going to optical computing you will never get a 100 GHz processor - we hit a limit around 3 GHz because of electrical constraints unrelated to the transistors themselves: the physical limits of propagating electrical clock edges across a finite die size.

    100 GHz is only possible by shrinking the die down to the size of a grain of sand (which probably keeps performance constant) or by switching to some type of optical interconnect and processing (though even that may not allow processing to go much faster with sufficient throughput). The problem is that you quickly run into the same limit again even with light (the difference in speed is only about 60%), so a more radical architectural change is likely required (i.e. a quantum computer, or non-von-Neumann/non-Harvard computing: very out there!).

    So I'm not holding my breath on graphene at this point (and I work in the semiconductor industry). My money is on other technologies being more likely instead.
  4. Jan 13, 2012 #3
    At this point the major drawback concerning graphene is its physical properties - exactly what makes it such a phenomenal substance. Graphene is basically a sheet of carbon atoms one atom thick, or an unrolled carbon nanotube (I don't think anyone has actually unrolled a nanotube). Stacking sheets of graphene gives you graphite, a very brittle material.

    The answer is similar to why LEOs and the military aren't wearing carbon nanotube vests.

    To make practical consumer products out of these materials, the best known option is to create a matrix of the carbon and another material.

    The basic idea is that graphene can be used to thermally or structurally enhance a known material, e.g. Kevlar-graphene or steel-graphene composites.

    This I fear is a long way off, although money talks. Throw some coin at it if you can.
  5. Jan 15, 2012 #4
    I'd say precision engineering is more of an issue in commercial computing than people realize. The entire industry has a very brute-force engineering approach reminiscent of 19th-century engineers who perfected the steam engine before the laws of thermodynamics were formulated. It's precisely that kind of precision engineering that makes advances like spintronics and optical computing look much more promising than graphene for the foreseeable future, because we already know a great deal more about the properties of those other materials.

    Although 100 GHz sounds exciting, optical circuits can already achieve 60 THz. Spintronics is theoretically capable of similar speeds and even room-temperature quantum computing. Both also have a great deal more research behind them than graphene does, and when you're talking about a multi-billion-dollar chip industry investing billions in a single factory, it isn't likely to bet on any new and unproven technology without a lot of research behind it to prove its long-term value.
  6. Apr 13, 2012 #5
    Huh... never thought of it this way.
    Under perfect conditions, at 4 GHz you get about 7.5 cm of leeway before your clock edges start overlapping. :)
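    That 7.5 cm figure can be checked with a quick back-of-the-envelope sketch. This assumes signals travel at the vacuum speed of light; real on-chip electrical signals propagate at a fraction of c, so the actual leeway is tighter:

    ```python
    # Back-of-the-envelope: how far a signal travels in one clock period.
    # Assumes vacuum speed of light; on-chip signals are slower, so the
    # real leeway is smaller than these figures.

    C = 3.0e8  # speed of light, m/s

    def leeway_cm(freq_hz):
        """Distance (in cm) a signal travels during one clock period."""
        return C / freq_hz * 100

    print(leeway_cm(4e9))    # ~7.5 cm at 4 GHz
    print(leeway_cm(100e9))  # ~0.3 cm at 100 GHz
    ```

    At 100 GHz the per-cycle travel distance shrinks to about 3 mm, well under typical die sizes, which is the propagation limit described above.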
    But yeah, graphene is a nice dream, but not something I expect to see anytime soon.
    Quantum computing is probably even further in the future, however.
    (incidentally - yay for my 100th post)
  7. May 10, 2012 #6
    There are graphene chips; IBM has them. I don't think it's a product issue, but rather that they have classified applications at present and a gap must be maintained between civilian and military hardware.