Will graphene replace silicon?

In summary, graphene is a promising material with the potential to surpass silicon in computing technology. However, many practical obstacles currently prevent its widespread use, including its physical properties and the need for new infrastructure and manufacturing technologies. Other approaches, such as spintronics and optical computing, have more research and investment behind them, making them more likely candidates for practical use in the near future.
  • #1
sirchick
Well, I don't read much on this subject of graphene, but I have read there are obstacles to overcome... much like quantum computers, which probably have a million times more complicated issues to overcome. I'm skeptical that a quantum computer will exist in the home for 100+ years...

But will graphene replace silicon in our computers any time soon? Can anyone from their knowledge on the technology see a graphene age on the horizon?

What are the big obstacles and how soon can you see it reaching our homes?

I can just imagine the benefits projects like folding@home would get from 100 GHz CPUs!
 
  • #2
Graphene is where silicon was in 1948 (the first transistor used germanium, BTW): nice in theory and very promising, but with many practical issues that prevented even lab prototype devices from being made. It wasn't even possible to make a reliable silicon transistor until the mid-to-late 1950s.

Will graphene be practical in only 10 years? It's a long shot because of current infrastructure and technologies: if it can't be made with existing technologies or interfaced to existing technologies (namely silicon ICs), it will be stillborn. To get from point A to point Z you need to traverse B through Y, and those intermediate steps depend on what already exists.

An excellent historical parallel exists with GaAs, which has been predicted to replace silicon "real soon now" since the late 1960s! It still hasn't happened. In fact, all III-V semiconductors (of which GaAs is about 50%) still account for less than 1% of silicon in both volume and revenue.

The other "real world" issue is the lack of capital to invest in new technologies (even nice, cool ones like graphene) because of the US financial crisis of 2007 and the current euro crisis: R&D and start-up funding have been as rare as hen's teeth. The best funding source for any leading-edge technology over the last 10 years has been China (specifically, the Chinese government).

Oh, and without going to an optical computer you will never get a 100 GHz processor. We hit a limit around 3 GHz because of electrical constraints unrelated to the transistors themselves: the physical limits of propagating electrical clock edges across a finite die size.

100 GHz is only possible by shrinking the die down to the size of a grain of sand (which probably keeps performance constant) or by switching to some type of optical interconnect and processing (though that still may not allow processing to go much faster in overall throughput). The problem is that you quickly run into the same limit again even with light (the difference in propagation speed is only about 60%), so a more radical architectural change is likely required (i.e., a quantum computer or non-von-Neumann/non-Harvard computing: very out there!).

So I'm not holding my breath on graphene at this point (and I work in the semiconductor industry). My money is on other technologies being more likely.
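
To put rough numbers on the clock-edge argument above, here is a minimal sketch. It assumes on-chip electrical signals travel at roughly 0.6c, a rule-of-thumb figure rather than a value for any particular process, so treat the outputs as order-of-magnitude only:

```python
# Rough sanity check of the clock-distribution argument above.
# Assumption: on-chip electrical signals travel at about 0.6c
# (a common rule-of-thumb figure, not a measured value for any
# particular process or interconnect stack).

C = 3.0e8                # speed of light in vacuum, m/s
V_ELECTRICAL = 0.6 * C   # assumed on-chip electrical signal velocity

def reach_per_cycle(freq_hz, velocity):
    """Distance a signal can travel in one clock period, in metres."""
    return velocity / freq_hz

for f in (3e9, 100e9):
    elec = reach_per_cycle(f, V_ELECTRICAL)
    light = reach_per_cycle(f, C)
    print(f"{f/1e9:>5.0f} GHz: electrical ~{elec*100:.1f} cm, "
          f"light ~{light*100:.1f} cm")

# Approximate output:
#     3 GHz: electrical ~6.0 cm, light ~10.0 cm
#   100 GHz: electrical ~0.2 cm, light ~0.3 cm
```

Under those assumptions, the single-cycle reach at 100 GHz shrinks to a few millimetres even for light, which is roughly the "grain of sand" die size mentioned above.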
 
  • #3
At this point the major drawback with graphene is its physical properties, exactly what makes it such a phenomenal substance. Graphene is basically a sheet of carbon atoms one atom thick, or an unrolled carbon nanotube (I don't think anyone has actually unrolled a nanotube). Stacking sheets of graphene gives you graphite, a very, very brittle material.

The answer is similar to why LEOs and the military aren't wearing carbon nanotube vests.

To make practical consumer products out of these materials, the best-known option is to create a matrix combining the carbon with another material.

The basic idea is that graphene can be used to thermally or structurally enhance a known material, e.g. Kevlar-graphene or steel-graphene.

This I fear is a long way off, although money talks. Throw some coin at it if you can.
 
  • #4
I'd say precision engineering is more of an issue in commercial computing than people realize. The entire industry has a very brute-force engineering approach, reminiscent of 19th-century engineers who perfected the steam engine before the laws of thermodynamics were formulated. It's precisely that kind of precision engineering that makes advances like spintronics and optical computing look much more promising than graphene for the foreseeable future, because we already know a great deal more about the properties of those other materials.

Although 100 GHz sounds exciting, optical circuits can already achieve 60 THz. Spintronics is theoretically capable of achieving similar speeds and even room-temperature quantum computing. Both also have a great deal more research behind them than graphene does, and when you start talking about multi-billion-dollar chip makers investing billions in a single factory, they aren't likely to bet on any new and unproven technology without a lot of research behind it to prove its long-term value.
 
  • #5
jsgruszynski said:
Oh, and without going to an optical computer you will never get a 100 GHz processor. We hit a limit around 3 GHz because of electrical constraints unrelated to the transistors themselves: the physical limits of propagating electrical clock edges across a finite die size.

[...]

So I'm not holding my breath on graphene at this point (and I work in the semiconductor industry). My money is on other technologies being more likely.

Huh... never thought of it this way.
Under perfect conditions at 4 GHz you get about 7.5 cm of leeway before your clock edges start intersecting. :)
But yeah, graphene is a nice dream, just not something I expect to see anytime soon.
Quantum computing is probably even further in the future, however.
(Incidentally, yay for my 100th post.)
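
For reference, the 7.5 cm figure above is just the vacuum speed of light divided by the clock frequency, which is the "perfect conditions" case; real on-chip signals are slower. A minimal check:

```python
# Quick check of the 7.5 cm figure above, assuming the signal
# travels at the vacuum speed of light; real interconnects are slower.
c = 3.0e8   # m/s
f = 4.0e9   # 4 GHz clock
print(f"{c / f * 100:.1f} cm per clock period")   # -> 7.5 cm
```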
 
  • #6
There are graphene chips; IBM has them. I don't think it's a product issue, but rather that they have classified applications at present and a gap must be maintained between civilian and military hardware.
 

1. Will graphene replace silicon in all electronic devices?

No, graphene is not likely to completely replace silicon in all electronic devices. While graphene has many promising properties, it also has some limitations that may make it unsuitable for certain applications. Additionally, silicon has been the backbone of the electronics industry for decades and has a well-established infrastructure and manufacturing processes.

2. What advantages does graphene have over silicon?

Graphene has several advantages over silicon, including its high conductivity, flexibility, and transparency. It is also much thinner and lighter than silicon, making it ideal for use in wearable devices and flexible screens. Additionally, graphene has a higher electron mobility, meaning it can carry electrical charge more quickly and efficiently.
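
To make the mobility point concrete, here is a minimal sketch using the low-field drift-velocity relation v = μE. The mobility numbers are commonly quoted ballpark figures (not data from any specific device), and the linear model ignores velocity saturation at high fields, so treat the result as order-of-magnitude only:

```python
# Illustrative comparison of the "higher electron mobility" point.
# Mobility values are rough, commonly quoted figures (assumptions,
# not device measurements); v = mu * E is the low-field drift model
# and ignores velocity saturation at high fields.

MU_SILICON = 1.4e3    # cm^2/(V*s), bulk silicon electrons (approx.)
MU_GRAPHENE = 1.0e4   # cm^2/(V*s), graphene on a substrate (approx.)

E_FIELD = 1.0e3       # V/cm, an arbitrary illustrative field

def drift_velocity(mobility, field):
    """Low-field drift velocity v = mu * E, in cm/s."""
    return mobility * field

v_si = drift_velocity(MU_SILICON, E_FIELD)
v_gr = drift_velocity(MU_GRAPHENE, E_FIELD)
print(f"silicon : {v_si:.1e} cm/s")
print(f"graphene: {v_gr:.1e} cm/s  (~{v_gr / v_si:.0f}x faster)")
# -> silicon : 1.4e+06 cm/s ; graphene: 1.0e+07 cm/s (~7x faster)
```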

3. Can graphene be used to make computer processors?

Yes, graphene has the potential to be used in computer processors. Its high conductivity and fast electron mobility make it a promising material for high-speed computing. However, there are still challenges to be overcome in terms of manufacturing and integration with existing technology.

4. Is graphene more expensive than silicon?

Currently, graphene is more expensive to produce than silicon. However, as research and development on graphene continue, it is expected that production costs will decrease. Additionally, the potential for graphene to be used in a wider range of applications may make it a more cost-effective option in the long run.

5. Are there any environmental concerns with using graphene instead of silicon?

There are some potential environmental concerns with using graphene, as it is a relatively new material and its long-term effects are not yet fully understood. Additionally, the production process for graphene can be energy-intensive and may produce harmful byproducts. However, efforts are being made to develop more sustainable and environmentally friendly methods of producing graphene.
