Transcending Moore's law?

  1. Nov 26, 2014 #1
    The physical limit on how small you can make a functionally viable transistor is fast approaching and should hit a stone wall within the next 10 years or less. How will electronic engineers and computer scientists compensate for this problem?

    Without some revolutionary breakthrough in the design and fundamental physics of microprocessor chips, I don't see how they can. Are there any workarounds being discussed and researched for this issue?

    This is why commercially available CPUs from Intel and AMD have gone in the direction of energy efficiency instead of sheer performance. It's becoming very difficult to squeeze more performance out of each generation of microarchitecture without resorting to measures like tweaking instruction sets, adding more cache, and improving memory bandwidth and motherboard chipsets.
     
    Last edited: Nov 26, 2014
  3. Nov 26, 2014 #2

    rcgldr

    Homework Helper

    Unless there's some breakthrough in the future, once some physical limit is reached, there's not much that can be done. I'm not sure if the limit comes from the transistor size itself or from the process of producing a chip. Current methods seem to include stacking more layers on a chip, but then there needs to be a way to dissipate the heat from the inner layers.

    Another issue is that in order to increase speed, you need a higher voltage-to-gate-size ratio, but this creates a localized heat problem that is easiest to solve by reducing gate density (more space between gates to dissipate heat). Lower density, though, means larger chips and/or a reduced gate count, which translates into increased cost. This is why consumer-oriented processors have been stuck at around 4 GHz unless liquid cooling is used.
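
    For context, the first-order physics behind that trade-off is the standard CMOS dynamic power relation (textbook scaling, not taken from any particular datasheet):

    $$P_{\text{dynamic}} \approx \alpha C V^2 f$$

    where ##\alpha## is the switching activity factor, ##C## the switched capacitance, ##V## the supply voltage, and ##f## the clock frequency. Since raising ##f## generally requires raising ##V## as well, power density grows much faster than clock speed, which is why the heat problem bites long before the transistors themselves stop working.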
     
    Last edited: Nov 26, 2014
  4. Nov 26, 2014 #3
    Do you think quantum computers will ever be available for retail purchase and become commonplace in most households, presumably in the distant future?

    Also, is it possible to make a room-temperature quantum computer? Currently, these machines have to be cooled to near absolute zero with liquid helium in order to maintain the quantum coherence of the qubits, which would be highly impractical for household use.

    In theory, using topological insulators in quantum processors would solve this issue, but actually building a topological quantum computer has proven to be a daunting challenge for scientists and engineers.

    Quantum computing is still in its infancy and won't be perfected for quite some time.
     
    Last edited: Nov 26, 2014
  5. Dec 1, 2014 #4
    >Quantum computing is still in its infancy and won't be perfected for quite some time.

    That seems to be the sum of it for now. Apparently there are mounting examples of quantum effects in nature, so it does indeed seem possible to perform quantum computation at much higher temperatures. I believe the efficiency of photosynthesis is put down to quantum effects.
     
  6. Dec 1, 2014 #5
    Well, there are certainly things you can still do at the level of the CPU's architecture or its implementation before we get into technology that is still highly speculative and experimental and won't be available to the end user for a long time, if ever. The individual components on an IC can only get so small, but that doesn't mean you can't keep adding more of them to a more efficient architecture, improve the cooling, or write more efficient software.
     
  7. Dec 1, 2014 #6

    jim mcnamara


    Staff: Mentor

    Commodity consumer CPU chips - what this discussion is about - are benefiting from improved compilers. Intel (x86) and Fujitsu (SPARC) provide special opcodes and specialty libraries to take full advantage of a given CPU architecture. Since gcc (Linux/Android/consumer electronics) has made strides in this arena, Microsoft is very likely doing the same - improving the Visual Studio compilers.
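
    To make that concrete, here is a minimal sketch (my own illustration, not Intel's library code) of the kind of CPU-specific opcode those specialty libraries wrap, using the x86 SSE intrinsics exposed through <immintrin.h>; it assumes an x86 processor with SSE support:

    ```cpp
    // One ADDPS instruction adds four floats at once - the sort of
    // architecture-specific opcode a good compiler or library exploits.
    #include <immintrin.h>
    #include <cstdio>

    int main() {
        alignas(16) float a[4] = {1, 2, 3, 4};
        alignas(16) float b[4] = {10, 20, 30, 40};
        alignas(16) float c[4];

        __m128 va = _mm_load_ps(a);      // load four floats into a 128-bit register
        __m128 vb = _mm_load_ps(b);
        __m128 vc = _mm_add_ps(va, vb);  // all four lanes added in one instruction
        _mm_store_ps(c, vc);

        std::printf("%g %g %g %g\n", c[0], c[1], c[2], c[3]);
        return 0;
    }
    ```

    A modern auto-vectorizing compiler (e.g. gcc at -O3 with a suitable -march flag) will often emit the same instruction from a plain loop, which is exactly the kind of compiler improvement being described.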
     
  8. Dec 1, 2014 #7

    Mark44

    Staff: Mentor

    As I understand things, the "wall" is not 10 years out - it has already been hit. CPUs haven't increased in clock speed for the past several years, and seem to top out at about 3 to 3.5 GHz. Feature size inside a chip is closely tied to how fast the chip can run: the smaller the individual components, the closer together they are, and the faster information can be transferred between them.

    To compensate for this inability to produce chips at finer resolutions, manufacturers such as Intel and AMD have been putting more CPU cores in a single chip. My current desktop, which I bought about a year ago, runs a quad-core processor with hyper-threading, which presents eight logical CPUs.
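
    As a rough illustration of how software has to respond to that change (a sketch of my own, nothing vendor-specific), standard C++ can query the logical CPU count and fan work out across all of them:

    ```cpp
    // Split a big summation across every logical CPU the OS reports.
    #include <iostream>
    #include <numeric>
    #include <thread>
    #include <vector>

    int main() {
        unsigned n = std::thread::hardware_concurrency();  // e.g. 8 on the chip above
        if (n == 0) n = 1;                                 // the call may report 0
        std::cout << n << " logical CPUs\n";

        const long long N = 100000000;
        std::vector<long long> partial(n, 0);              // one slot per thread: no races
        std::vector<std::thread> pool;

        for (unsigned t = 0; t < n; ++t)
            pool.emplace_back([&partial, t, n, N] {
                for (long long i = t; i < N; i += n)       // interleaved stride per thread
                    partial[t] += i;
            });
        for (auto& th : pool) th.join();

        std::cout << "sum = " << std::accumulate(partial.begin(), partial.end(), 0LL) << '\n';
        return 0;
    }
    ```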
     
    Last edited: Dec 1, 2014
  9. Dec 5, 2014 #8
    The problem with transcending Moore's law is that there isn't really a very pressing need yet. Sure, some industries need a whole whopping amount of computing power, but I believe necessity really is the mother of invention. When things really started to take off in the 1970s, there were a lot of strides that needed to be made to get computers into the hands of the everyday person or to make them good enough to complete a necessary task. People came up with inventive techniques for chip manufacturing and design because there was a need for a faster machine or a more powerful one. The demands of the common user drove software companies to include more features, increasing program and OS size and requiring better machines.

    I would make the argument that the user base of the world at large is satisfied with the speed and capabilities of the modern computer. Most current development in computing is in creating better, leaner, and more intelligent software that takes advantage of the available hardware. In contrast to even 20 years ago, the common user doesn't have to worry about running out of memory or disk space very often. The speed issue today is about how fast we expect things to run, not that things are really too slow (more an issue of impatience than necessity). As I said before, there are many companies out there trying to solve problems of immense complexity that require more computing power, but I would argue they are the only ones really driving the increases in hardware tech.

    We'll only get around our limitations when transistors stop being the base technology of computing. It's been only about 75 years since Shannon published his thesis linking Boolean algebra to computing, and the transistor was invented only about 65 years ago. We've been iterating on that train of thought ever since, and we may have already reached the end of the tracks. There might be directions computing can take that we haven't thought of yet. Quantum computing is a great advance in a slightly different direction, and I hope it takes off. However, I think that only when we really need the computing power will we see that kind of growth rate in computing again.
     
  10. Dec 5, 2014 #9

    nsaspook

    Science Advisor

    The 'wall' is still some distance out in terms of what can be done even with old man silicon. The Intels of the world have to make money off their massive investments in current technology.
    http://semiengineering.com/will-7nm-and-5nm-really-happen/
     
  11. Dec 5, 2014 #10

    Mark44

    Staff: Mentor

    Also, I was reading up on the next generation of Intel CPUs - the Haswell architecture. I don't recall the feature size, but the clock rate was around 4.6 GHz or so.

    I remember reading some time back that a big concern with the smaller feature size was quantum effects due to electrons "tunneling" to different levels. I haven't seen anything about that lately, but then again, haven't been following that closely.
     
  12. Dec 6, 2014 #11
    I read in Maximum PC magazine that AMD plans on releasing an air-cooled quad core chip with a default clock speed of 5GHz.

    Not sure how they can get away with air cooling at 5 GHz; that's usually the cutoff beyond which liquid nitrogen is required.

    AMD's flagship CPU is usually only about half as fast as Intel's.
     
  13. Jan 2, 2015 #12

    TheDemx27

    Gold Member

    IMHO, quantum computing won't really be what extends Moore's law. From what I've heard, quantum computers aren't that great for the applications we are used to and are only good at certain operations, e.g. finding prime factors. I can't see the quantum computer becoming more than a commercial product. There isn't any reason to put one in the hands of the average Joe, as he won't have many uses for it.
     
  14. Jan 2, 2015 #13

    phinds

    Gold Member

    Right. And Bill Gates was SURE that 640K would be all the memory anyone would ever need. It was just inconceivable that more could be required for a single person.
     
    Last edited: Jan 2, 2015
  15. Jan 2, 2015 #14

    SteamKing

    Staff Emeritus
    Science Advisor
    Homework Helper

    At one time, the chairman of IBM (!) predicted that the market for general-purpose computers was too limited to warrant the company's investment in their production, or so the story goes. Large mainframe computers quickly evolved into more affordable minicomputers, thence to superminis, workstations, and finally desktops, portables, laptops, and tablets, all within one lifetime. Sure, a lot of the things we use computers for are completely mundane, but that doesn't mean society hasn't changed as a result.

    Right now, the applications of quantum computing may seem non-existent, but once they are built, who knows? They could lead to a Hacker Apocalypse, where no conventional system is safe from being plundered.
     
  16. Jan 3, 2015 #15

    SixNein

    Gold Member

    The problem of course is quantum tunneling.

    http://en.wikipedia.org/wiki/Quantum_tunnelling
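
    For context, the standard textbook estimate for tunneling through a rectangular barrier (general physics, not specific to any process node) shows why this gets exponentially worse as insulating layers shrink:

    $$T \approx e^{-2\kappa L}, \qquad \kappa = \frac{\sqrt{2m(V_0 - E)}}{\hbar}$$

    where ##L## is the barrier width. Halving the barrier width takes the transmission from ##T## to ##\sqrt{T}## - an enormous increase when ##T## is tiny - so at few-nanometre feature sizes gate leakage becomes unavoidable.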

    At any rate, we will adjust by making software more efficient on current architectures.
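
    As an everyday example of that kind of gain (illustrative only; exact timings vary by machine), nothing but the traversal order changes below, yet the cache-friendly loop is typically several times faster on current CPUs:

    ```cpp
    // Same arithmetic, different memory-access order.
    #include <chrono>
    #include <cstddef>
    #include <iostream>
    #include <vector>

    int main() {
        const std::size_t n = 4096;
        std::vector<int> m(n * n, 1);
        long long sum = 0;

        auto t0 = std::chrono::steady_clock::now();
        for (std::size_t i = 0; i < n; ++i)    // row-major: walks consecutive addresses,
            for (std::size_t j = 0; j < n; ++j)  // so each cache line is fully reused
                sum += m[i * n + j];
        auto t1 = std::chrono::steady_clock::now();

        for (std::size_t j = 0; j < n; ++j)    // column-major: jumps n ints each step,
            for (std::size_t i = 0; i < n; ++i)  // roughly one cache miss per access
                sum += m[i * n + j];
        auto t2 = std::chrono::steady_clock::now();

        using ms = std::chrono::milliseconds;
        std::cout << "row-major:    " << std::chrono::duration_cast<ms>(t1 - t0).count() << " ms\n"
                  << "column-major: " << std::chrono::duration_cast<ms>(t2 - t1).count() << " ms\n"
                  << "(checksum " << sum << ")\n";
        return 0;
    }
    ```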

    The next big speedup won't come from changes to CPU frequencies; instead, it will come from new architectures. Architectures based on the brain for example could be very powerful.

    Current chips are much faster than the human brain at raw serial speed, but the brain massively outperforms them at parallel processing - and does it on very little energy.
     
  17. Jan 3, 2015 #16
    A quantum processor would be like a graphics processor: it would be tasked by a conventional computer with the problems it is exceptionally good at. There is no reason to think the applications for a $2000 quantum processor would be exclusively commercial.
     
  18. Jan 3, 2015 #17
    I have been a software engineer for over four decades - since before the term was invented. There has never been the perception of a "pressing need". Nevertheless, additional processing power and memory capacity have always been exploited.
     
  19. Jan 4, 2015 #18
    What about topological quantum computing? As in, room-temperature quantum computers that don't have to be cooled to near absolute zero to maintain the coherence of their qubits.

    Is it currently possible to make topological insulators?
     
  20. Jan 4, 2015 #19
    Quantum information processing allows certain operations to be performed much faster than on conventional computers. However, quantum computers are not general purpose in the way conventional computers are.
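
    For intuition only, here is a toy classical simulation of the basic primitive (my own sketch; real quantum hardware is not programmed this way). A single qubit is just two complex amplitudes, and a gate such as the Hadamard mixes them; an n-qubit register carries ##2^n## amplitudes at once, which is where the speedups on special problems come from - and why simulating it classically blows up:

    ```cpp
    // One qubit = two complex amplitudes. A Hadamard gate turns |0>
    // into an equal superposition of |0> and |1>.
    #include <cmath>
    #include <complex>
    #include <iostream>

    using cx = std::complex<double>;

    int main() {
        const double s = 1.0 / std::sqrt(2.0);
        cx a0 = 1, a1 = 0;            // start in the state |0>

        // Hadamard: a0' = (a0 + a1)/sqrt(2), a1' = (a0 - a1)/sqrt(2)
        cx b0 = s * (a0 + a1);
        cx b1 = s * (a0 - a1);

        // |amplitude|^2 gives the measurement probability
        std::cout << "P(0) = " << std::norm(b0) << '\n'   // 0.5
                  << "P(1) = " << std::norm(b1) << '\n';  // 0.5
        return 0;
    }
    ```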

    Quantum information processing will not be an extension of Moore's law's timeline. It will be an entirely new direction in data processing.

    As for your technology suggestions (e.g., topological insulators), I don't doubt that some workable technology will be found.
     
  21. Jan 5, 2015 #20
    Okay, so "pressing need" was probably a bad choice of words. The computer wasn't built for any one specific purpose, so there were no "needs". I concede that point.

    What I meant was that we kept finding more and more uses for computers, but computing power was, until recently, very limiting. Gaming and print graphics, computer-aided design and engineering tools, computer animation, and other such applications were held back by the lack of computing power. The gains from Moore's law were more apparent with each new chipset because more applications could take advantage of the increase in speed, so there was more incentive to be creative with chip design and new manufacturing processes. The hardware, not the software, was clearly the limiting factor.

    Contrast that with today. I'm not sure the common user even notices the speed difference between their old machine and the new one. There's also a general shift away from personal computers toward laptops, tablets, and smartphones (i.e., less powerful but more mobile hardware). I would argue that processor speed for the common user is now more a "convenience" than a "necessity" - a weapon against software bloat. We strive for faster machines, but for what? Server farms and virtualization, Big Data problems, gaming, and maybe a few other things I'm sure I'm missing, but nothing the average user concerns themselves with.

    I would contend that unless we get something that the common user will need a heavier-duty processor for, we won't see the kinds of technological leaps that we got during the 1970s-1990s.
     