Is Graphene the Future of Computer Technology?

  • Thread starter: sirchick
  • Tags: Graphene, Silicon
AI Thread Summary
Graphene's potential to replace silicon in computing is hindered by significant obstacles, similar to the challenges faced by quantum computing. While graphene exhibits promising properties, practical applications remain elusive due to issues with current infrastructure and technology compatibility. Historical parallels, such as the long-predicted replacement of silicon by gallium arsenide, highlight the slow progress in adopting new materials. Funding for research and development has been constrained by economic crises, with China emerging as a primary investor in advanced technologies. Achieving high processing speeds, such as 100 GHz, is limited by electrical constraints, necessitating a shift to optical computing or other radical architectural changes. Current advancements in spintronics and optical circuits may offer more immediate benefits compared to graphene, which is still in the early stages of research and development.
sirchick
Well, I haven't read much on the subject of graphene, but I have read that there are obstacles to overcome... much like quantum computers, which have probably a million times more complicated issues to deal with. I'm skeptical that a quantum computer will exist in the home for 100+ years...

But will graphene replace silicon in our computers any time soon? Can anyone with knowledge of the technology see a graphene age on the horizon?

What are the big obstacles and how soon can you see it reaching our homes?

I can just imagine the benefits projects like folding@home would see with 100 GHz CPUs!
 
Graphene is where silicon was in 1948 (the first transistor used germanium, BTW): nice in theory and very promising, but with many practical issues that prevented even lab prototype devices from being made. It wasn't even possible to make a reliable silicon transistor until the mid-to-late 1950s.

Will graphene be practical in only 10 years? It's a long shot because of current infrastructure and technologies: if it can't be made with existing technologies or interfaced to existing technologies (namely silicon ICs), it will be stillborn. To get from point A to point Z you need to traverse B through Y, and those intermediate steps all depend on what already exists.

An excellent historical parallel exists with GaAs, which has been predicted to replace silicon "real soon now" since the late 1960s! It still hasn't happened. In fact, all III-V semiconductors (of which GaAs is about 50%) still account for less than 1% of silicon in both volume and revenue.

The other "real world" issue is the lack of capital to invest in technologies (even nice cool ones like graphene) because of the US financial crisis of 2007 and the current Euro crisis: R&D and start-up funding has been as rare as hen's teeth. The best funding source for any leading edge technology has been China (specifically the Chinese government) in the last 10 years.

Oh, and without going to optical computing you will never get a 100 GHz processor - we hit a limit around 3 GHz because of electrical constraints unrelated to the transistors themselves, namely the physical limits of propagating electrical clock edges across a finite die size.

100 GHz is only possible by shrinking the die down to the size of a grain of sand (which probably keeps performance constant) or by switching to some type of optical interconnect and processing (though even that may not raise throughput much). The problem is that you quickly run into the same limit again even with light (the speed difference is only about 60%), so a more radical architectural change is likely required (i.e. quantum computing or non-von-Neumann/non-Harvard computing: very out there!).
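To put rough numbers on that clock-edge limit, here's a minimal sketch. The 0.6c figure for on-chip electrical propagation and the one-cycle-reach criterion are simplifying assumptions for illustration, not process data:

```python
# Rough estimate of how far a clock edge can travel in one clock cycle.
# Assumed signal speeds: electrical interconnect ~0.6c, optical ~1.0c.

C = 3.0e8  # speed of light, m/s

def reach_per_cycle(freq_hz: float, speed_fraction: float) -> float:
    """Distance (m) a signal covers in one clock period."""
    period_s = 1.0 / freq_hz
    return C * speed_fraction * period_s

for freq_ghz in (3, 4, 100):
    electrical = reach_per_cycle(freq_ghz * 1e9, 0.6) * 100  # cm
    optical = reach_per_cycle(freq_ghz * 1e9, 1.0) * 100     # cm
    print(f"{freq_ghz:>4} GHz: electrical ~{electrical:.2f} cm, "
          f"optical ~{optical:.2f} cm per cycle")
```

At 100 GHz even light covers only about 3 mm per clock cycle, which is where the grain-of-sand die comes from.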

So I'm not holding my breath on graphene at this point (and I work in the semiconductor industry). My money is on other technologies being more likely.
 
At this point the major drawback concerning graphene is its physical properties - exactly what makes it such a phenomenal substance. Graphene is basically a sheet of carbon atoms one atom thick, or an unrolled carbon nanotube (I don't think anyone has actually unrolled a nanotube). Stacking sheets of graphene gives you graphite: a very, very brittle material.

The answer is similar to why LEOs and the military aren't wearing carbon nanotube vests.

To make practical consumer products out of these materials, the best known option is to create a matrix of the carbon and another material.

The basic idea is that graphene can be used to thermally or structurally enhance a known material, e.g. kevlar-graphene or steel-graphene composites.

This, I fear, is a long way off, although money talks. Throw some coin at it if you can.
 
I'd say precision engineering is more of an issue in commercial computing than people realize. The entire industry has a very brute-force engineering approach reminiscent of 19th-century engineers, who perfected the steam engine before the laws of thermodynamics were formulated. It's precisely that kind of precision engineering that makes advances like spintronics and optical computing look much more promising than graphene for the foreseeable future, because we already know a great deal more about the properties of those other materials.

Although 100 GHz sounds exciting, optical circuits can already achieve 60 THz. Spintronics is theoretically capable of similar speeds, and even of room-temperature quantum computing. Both also have a great deal more research behind them than graphene does, and when you start talking about multi-billion-dollar chip companies investing billions in a single factory, they aren't likely to bet on any new and unproven technology without a lot of research behind it to prove its long-term value.
 
jsgruszynski said:
Oh, and without going to optical computing you will never get a 100 GHz processor - we hit a limit around 3 GHz because of electrical constraints unrelated to the transistors themselves, namely the physical limits of propagating electrical clock edges across a finite die size.

Huh... never thought of it this way.
Under perfect conditions, at 4 GHz you get about 7.5 cm of leeway before your clock edges start overlapping. :)
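A quick sanity check of that figure, assuming the edge travels at the vacuum speed of light (the "perfect conditions" part):

```python
C = 3.0e8            # speed of light, m/s ("perfect conditions")
period = 1 / 4.0e9   # one clock period at 4 GHz, in seconds
print(C * period)    # 0.075 m, i.e. about 7.5 cm of leeway per edge
```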
But yeah, graphene is a nice dream, just not something I expect to see anytime soon.
Quantum computing is probably even further in the future, however.
(incidentally - yay for my 100th post)
 
There are graphene chips; IBM has them. I don't think it's a product issue, but rather that they have classified applications at present and a gap must be maintained between civilian and military hardware.
 