Moore's law is quickly becoming obsolete. What will they do next?

In summary: the article says that the smallest current retail microprocessor is 32nm in fabrication, and electronic engineers are struggling to make transistors smaller and smaller, as the laws of physics dictate that there is a limit to how small you can make a transistor. Later this year, consumer hardware manufacturers such as Intel, AMD, NVidia and ATI are making 28nm chips; making them any smaller than that could be extremely difficult. Other than that, the way I see things going is to go parallel, or develop further in that direction, rather than trying to push the GHz number higher. Either way, the challenges ahead are interesting, and it will be just as interesting to see where things go.
  • #1
FishmanGeertz
The smallest current retail microprocessor is 32nm (nanometers) in fabrication. Electronic engineers are struggling to make transistors smaller and smaller as the laws of physics dictate that there is a limit to how small you can make a transistor. Later this year, consumer hardware manufacturers such as Intel, AMD, NVidia and ATI, are making 28nm chips. Making them any smaller than that could be extremely difficult.

What will electronic engineers and computer giants do to further increase the performance and efficiency of microprocessors after the absolute ceiling of Moore's law has been reached? I've read stories about how they're actually going to try and stack transistors atop of each other. There are also some other theoretical technologies and methods of improving computational performance and efficiency in microchips without making the chip smaller and smaller.

I'm sure they have plenty of ideas up their sleeves that we don't know about.

Ballistic deflection transistors are one of them. Instead of using electrical currents to switch transistors on and off, a BDT uses tiny pulses of light. This technology could theoretically allow microprocessors to run at terahertz speeds while generating very little heat. We struggle to cool our CPUs at speeds past 3.0 GHz, which is why overclocking past that requires radical cooling.

Imagine having an 80-core CPU, with each core running at something like 40.0 THz, running an application that can actually fully utilize such awesome computational horsepower. Crysis 3? Not to mention other computer components, such as graphics cards, could be made from the same technology.

Here is a link with some good info about ballistic deflection transistors.

http://www.rochester.edu/news/show.php?id=2585

So far, nobody has been able to make a working prototype or "proof of concept" model.
 
  • #2
I don't know if it's feasible, but they might start looking into using materials like diamond, although I don't know what the transistor equivalent would be.

Other than that, the way I see things going is to go parallel, or develop further in that direction, rather than trying to push the GHz number higher.
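To put a rough number on how far "going parallel" can take you, here is a minimal sketch of Amdahl's law in C. The 90% parallel fraction and the core counts are illustrative assumptions, not figures from any real chip or workload:

```c
#include <stdio.h>

/* Amdahl's law: speedup on n cores when a fraction p of the work
 * can actually run in parallel. */
static double amdahl_speedup(double p, int n)
{
    return 1.0 / ((1.0 - p) + p / n);
}

int main(void)
{
    const double p = 0.90;                   /* assumed parallel fraction */
    const int cores[] = {2, 4, 8, 16, 80};

    for (int i = 0; i < 5; i++)
        printf("%3d cores -> %.2fx speedup\n",
               cores[i], amdahl_speedup(p, cores[i]));

    /* Even with 80 cores, a 90%-parallel program tops out below 10x,
     * so the software side matters as much as the core count. */
    return 0;
}
```

With those assumed numbers, even 80 cores only buy about a 9x speedup, which is why "just add cores" only helps as much as the application allows.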

Either way I agree the challenges ahead are interesting and it will be just as interesting to see where things go.
 
  • #3
FishmanGeertz said:

The article you linked to is 5 years old. And it's from the university that did the research. They just wanted publicity to get more funding.

Part of the article is completely misleading:
The BDT is "ballistic" because it is made from a sheet of semiconductor material called a "2D electron gas," which allows the electrons to travel without hitting impurities, which would impede the transistor's performance.
A 2D electron gas is not a sheet of semiconductor material, it's a 2D electron gas. And a 2DEG does not necessarily lead to ballistic transport. The channel in a HEMT or even MOSFET is a 2DEG, but they can operate in both ballistic and non-ballistic regimes, depending on gate length and many other factors.

FishmanGeertz said:
The smallest current retail microprocessor is 32nm (nanometers) in fabrication. Electronic engineers are struggling to make transistors smaller and smaller as the laws of physics dictate that there is a limit to how small you can make a transistor.
chiro said:
Either way I agree the challenges ahead are interesting and it will be just as interesting to see where things go.
You guys make it sound like it's been a walk in the park until now. It has always been a struggle. The biggest one is to make the manufacturing process cheap and reliable. Finding a device that works is not the issue. Finding a device that works, can be manufactured at a reasonable cost, at high throughput, with good reliability and repeatability is the real issue.
 
  • #4
I remember that a little before the turn of the millennium, someone who I believe was at IBM came out with a paper claiming that over the next ten years transistor counts and storage would go up a hundred million million fold, and clock rates would go up (or cycle times would go down) by a hundred million fold.

Now, in the prior 30 years, clock rates had gone up a thousand fold, from MHz to GHz, and storage had gone up a million fold, from kilobytes to gigabytes.

Had this person's predictions been correct, then by last year you would have been able to stroll down to the local computer store and buy billion-gigabyte sticks of memory, perhaps hundred-billion-gigabyte hard drives, and million-gigahertz processors, and still be a little under the prediction.
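Just to multiply out what that prediction would have implied, here is a small C sketch. The starting points (roughly 1 GB and 1 GHz around 2000) and the 1e14 / 1e8 factors are assumptions based on how I remember the claim, not figures taken from the paper itself:

```c
#include <stdio.h>

int main(void)
{
    /* Roughly where things stood around 2000 (assumed starting points). */
    const double storage_gb_2000 = 1.0;     /* ~1 GB                      */
    const double clock_ghz_2000  = 1.0;     /* ~1 GHz                     */

    /* Growth factors claimed in the paper, as I recall them (assumptions). */
    const double storage_factor = 1e14;     /* "hundred million million"  */
    const double clock_factor   = 1e8;      /* "hundred million"          */

    printf("Predicted storage by ~2010: %.0e GB\n",
           storage_gb_2000 * storage_factor);
    printf("Predicted clock   by ~2010: %.0e GHz\n",
           clock_ghz_2000 * clock_factor);

    /* A "billion gigabyte" stick (1e9 GB) and a "million gigahertz" CPU
     * (1e6 GHz) would both still fall short of those predicted figures. */
    return 0;
}
```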

Half a decade ago I was still able to find that paper with a search, but now I can't seem to locate it. I wish someone could locate that paper again.

What I wonder is why they do not, for each market segment, identify the "sweet spot" for the amount of main memory for that market's CPU. I believe this number is not too difficult to come up with, and it probably changes less rapidly than the CPU itself. Then make the CPU package larger and stuff that sweet amount of memory inside on a separate chip, because RAM and processor don't work out economically on the same slab of silicon. The memory would then be 1 mm away from the processor, not 10 cm.

The processor manufacturers could even get the RAM manufacturers to customize parts for them. If it made sense, they could have 128-, 256-, or even 1024-bit-wide data paths to memory, and all of this would be hidden, unlike the RAMBUS catastrophe. By selecting the appropriate amount of memory for each market, I'd guess 99.99% of memory accesses would never have to leave the CPU package; pin counts might even go down, and motherboard design and memory-driver chipsets might be substantially simplified.
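As a back-of-envelope illustration of the distance argument only, here is a small C sketch. It assumes signals travel at roughly half the speed of light along a trace and assumes a 3 GHz clock (both assumptions); real memory latency is dominated by the DRAM array and the controller, not wire length:

```c
#include <stdio.h>

int main(void)
{
    /* Assumed signal speed on a board/package trace: ~half the speed of
     * light, a common rule of thumb rather than a measured value. */
    const double signal_speed = 1.5e8;        /* m/s                     */
    const double clock_hz     = 3.0e9;        /* assumed 3 GHz CPU clock */

    const double dist_board   = 0.10;         /* 10 cm to a DIMM slot    */
    const double dist_package = 0.001;        /* 1 mm inside the package */

    double t_board   = dist_board   / signal_speed;   /* one-way flight time */
    double t_package = dist_package / signal_speed;

    printf("10 cm trace: %.3f ns (%.2f clock cycles)\n",
           t_board * 1e9,   t_board * clock_hz);
    printf("1 mm trace:  %.4f ns (%.4f clock cycles)\n",
           t_package * 1e9, t_package * clock_hz);

    /* Real DRAM latency is dominated by the array and the memory
     * controller; this only illustrates the distance point above. */
    return 0;
}
```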
 
  • #5
According to Wikipedia, there are plans for Intel and other companies to go as small as 11nm. After that, there is a big gray area which points toward nanotechnology. That is well over a decade away, so I'm not really worried about it at all.
 
  • #6
Futurama said:
According to Wikipedia, there are plans for Intel and other companies to go as small as 11nm. After that, there is a big gray area which points toward nanotechnology. That is well over a decade away, so I'm not really worried about it at all.

Please explain how nanotechnology would be implemented in microprocessors and other electronic circuits.
 
  • #7
FishmanGeertz said:
Please explain how nanotechnology would be implemented in microprocessors and other electronic circuits.

No one knows yet. We'll figure it out when we get there, just as we have before.
 
  • #8
TylerH said:
No one knows yet. We'll figure it out when we get there, just as we have before.

We'll figure it out. Right?
 
  • #9
The issue isn't just a case of making smaller transistors; it's the combination of keeping component switching rates high (3 GHz to 4 GHz) while also shrinking component size. Fast switching rates require relatively high voltage for the size of the components, which generates heat. If the components are packed tightly together, then heat dissipation is an issue, and that is the main limitation with current chip designs.
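A minimal sketch of that trade-off, using the usual first-order dynamic-power estimate P ≈ C·V²·f. The capacitance and voltage numbers below are made-up placeholders for illustration, not real chip figures:

```c
#include <stdio.h>

/* Dynamic switching power of CMOS logic: P = C * V^2 * f */
static double dynamic_power(double c_farads, double v_volts, double f_hz)
{
    return c_farads * v_volts * v_volts * f_hz;
}

int main(void)
{
    const double c = 1.0e-9;   /* assumed total switched capacitance, 1 nF */

    /* Pushing the clock from 3 GHz to 4 GHz usually also needs a higher
     * supply voltage, so power grows faster than the clock does. */
    double p_3ghz = dynamic_power(c, 1.0, 3.0e9);   /* 1.0 V @ 3 GHz */
    double p_4ghz = dynamic_power(c, 1.2, 4.0e9);   /* 1.2 V @ 4 GHz */

    printf("3 GHz @ 1.0 V: %.1f W\n", p_3ghz);
    printf("4 GHz @ 1.2 V: %.1f W\n", p_4ghz);
    printf("ratio: %.2fx power for a 1.33x clock increase\n", p_4ghz / p_3ghz);
    return 0;
}
```

With those assumed numbers, a 33% clock increase costs almost twice the power, which is the heat problem in a nutshell.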
 
  • #10
rcgldr said:
The issue isn't just a case of making smaller transistors; it's the combination of keeping component switching rates high (3 GHz to 4 GHz) while also shrinking component size. Fast switching rates require relatively high voltage for the size of the components, which generates heat. If the components are packed tightly together, then heat dissipation is an issue, and that is the main limitation with current chip designs.

Yeah, if we could get rid of the leakage current in the transistors, we would be pretty well set.
 
  • #11
FishmanGeertz said:
The smallest current retail microprocessor is 32nm (nanometers) in fabrication. Electronic engineers are struggling to make transistors smaller and smaller as the laws of physics dictate that there is a limit to how small you can make a transistor. Later this year, consumer hardware manufacturers such as Intel, AMD, NVidia and ATI, are making 28nm chips. Making them any smaller than that could be extremely difficult.
Yes, it's definitely conceivable that we're getting close to saturation. While that feeling has been around for over a decade now, we may soon (in the next decade or two?) be approaching the regime where we transition from having to worry about technical challenges to having to worry about fundamental quantum limitations.

What will electronic engineers and computer giants do to further increase the performance and efficiency of microprocessors after the absolute ceiling of Moore's law has been reached?
I'm aware of two lines of investigation: (i) nanowire-based FETs (and the like), and (ii) replacements for silicon-based CMOS technology (e.g., oxide-based Mott transistors).

For logic, spintronics may eventually emerge as a real possibility, but I think that's still a fair ways off.

A Google search for "beyond CMOS" should turn up some useful results.
 
  • #12
Google MLDRAM, and then tell Google NO, you really did mean MLDRAM, to see information on using 4 voltage levels instead of 2 to store twice as much information in the same space.
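Here is a toy sketch of the idea in C, purely illustrative and not how any actual MLDRAM part maps its levels: with four distinguishable voltage levels, each cell holds two bits instead of one, so a byte fits in four cells instead of eight.

```c
#include <stdio.h>
#include <stdint.h>

/* Toy model: one "cell" holds one of 4 voltage levels, i.e. 2 bits. */
typedef uint8_t cell_t;   /* only values 0..3 are used */

/* Pack a byte into 4 four-level cells (2 bits per cell). */
static void pack_byte(uint8_t byte, cell_t cells[4])
{
    for (int i = 0; i < 4; i++)
        cells[i] = (byte >> (2 * i)) & 0x3;
}

/* Recover the byte from the 4 cells. */
static uint8_t unpack_byte(const cell_t cells[4])
{
    uint8_t byte = 0;
    for (int i = 0; i < 4; i++)
        byte |= (uint8_t)(cells[i] & 0x3) << (2 * i);
    return byte;
}

int main(void)
{
    cell_t cells[4];
    pack_byte(0xB6, cells);
    printf("levels: %u %u %u %u -> 0x%02X\n",
           cells[0], cells[1], cells[2], cells[3], unpack_byte(cells));
    /* 8 two-level cells would be needed for the same byte, so 4 levels
     * doubles the information per cell, as the MLDRAM idea suggests. */
    return 0;
}
```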

On a wilder note, G. Spencer-Brown and his Laws of Form would surface about every five years with claims that his methods would reduce the number of transistors and gates needed to implement processors by a large fraction, perhaps even an order of magnitude. But each time it was always almost ready, and the last bits needed to finish the patents were almost done. To be fair, he almost certainly accomplished more than I will in a lifetime. He got big players, like Digital Equipment, to take him seriously and pay him money. I haven't heard about him for the last couple of five-year cycles, but I am out of touch. It might be interesting to track down where that stands. Wiki has pages on him and on his Laws of Form.
 

1. What is Moore's Law and why is it becoming obsolete?

Moore's Law is an observation made by Intel co-founder Gordon Moore in 1965 (and revised in 1975) that the number of transistors on a microchip doubles approximately every two years, leading to exponential growth in computing power. However, as transistors approach the atomic scale, this trend is no longer sustainable, which is making Moore's Law obsolete.
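As a quick numeric illustration of that doubling in C (the starting transistor count below is only a rough placeholder for an early-1970s CPU, not a precise figure):

```c
#include <stdio.h>
#include <math.h>

int main(void)
{
    const double n0 = 2300.0;   /* rough transistor count of an early-1970s
                                   CPU, used only as an illustrative start */
    for (int years = 0; years <= 40; years += 10) {
        double n = n0 * pow(2.0, years / 2.0);   /* double every 2 years */
        printf("after %2d years: ~%.0f transistors\n", years, n);
    }
    /* Forty years of doubling every two years turns a few thousand
     * transistors into a few billion. */
    return 0;
}
```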

2. How has Moore's Law impacted technology?

Moore's Law has been the driving force behind the rapid advancement of technology, enabling smaller, faster, and more powerful devices. It has led to the development of smartphones, laptops, and other high-performance devices that we use daily.

3. What will happen now that Moore's Law is becoming obsolete?

As Moore's Law reaches its limit, the rate of technological advancement will slow down. This means that we will no longer see a doubling of computing power every two years. However, this does not mean that technology will stop advancing altogether.

4. What are the alternatives to Moore's Law?

There are several alternatives being explored to continue the trend of improving computing power. These include using new materials such as graphene, developing new architectures such as quantum computing, and improving the efficiency of current technologies through better software and hardware optimization.

5. How will the end of Moore's Law affect the economy and society?

The end of Moore's Law will have a significant impact on the economy and society. As the rate of technological advancement slows down, companies may struggle to keep up with the competition, and consumers may see a decrease in the rate of new and improved products. However, it also provides an opportunity for new technologies to emerge and create new industries and job opportunities.
