
Why the laws of physics will eventually decay Moore's Law

  1. Mar 6, 2013 #1
    I was reading an article which said that once you get silicon transistors down to a certain size, they won't be able to operate any more and will end up melting. I have always wondered: what is the point of trying to make transistors smaller when we can just make the actual chip bigger? I wouldn't mind a slightly larger laptop if it had double the processing power from simply adding a second microprocessor, doubling the transistor count while keeping the same transistor size. What is wrong with my assumption? Obviously it would have been done by now if it worked.
     
  3. Mar 7, 2013 #2

    nsaspook

    User Avatar
    Science Advisor

    It was expected in the 1990s that transistors would not scale down to just a few atoms, but so far we've been able to shrink some structures that small. Current chip sizes are driven by money: when you spend 5 billion dollars on a fab and equipment, getting the most from each wafer is the primary goal, and the way you do that is to make each chip smaller (while maintaining yield), because the cost of processing a whole wafer is about the same whether it holds 200 or 1000 die.
    http://spectrum.ieee.org/semiconductors/design/the-highk-solution
    http://spectrum.ieee.org/semiconductors/nanotechnology/ohms-law-survives-at-the-atomic-scale
     
  4. Mar 7, 2013 #3

    SteamKing

    User Avatar
    Staff Emeritus
    Science Advisor
    Homework Helper

    If you add a processor, you either have to use a bigger battery in your laptop or suffer reduced time between charges. You also add more heat which must be dissipated somehow. And although you have theoretically doubled your processing power, your software must be re-written to work on two processors simultaneously (if that can be done at all).

    Miniaturization of electronic components is not just a technical exercise; the process yields products which are lighter, cheaper, and faster.
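
    The "software must be re-written" point can be made quantitative with Amdahl's law: if only a fraction of a program's work parallelizes, a second processor never delivers a full 2x. A minimal Python sketch (the 90% parallel fraction below is just an illustrative assumption, not a measured figure):

    ```python
    def amdahl_speedup(parallel_fraction, n_processors):
        """Amdahl's law: overall speedup when only part of a program
        can be spread across multiple processors."""
        serial = 1.0 - parallel_fraction
        return 1.0 / (serial + parallel_fraction / n_processors)

    # Even if 90% of the work parallelizes perfectly, a second
    # processor gives about 1.8x, not 2x:
    print(round(amdahl_speedup(0.9, 2), 2))     # 1.82
    # ...and no number of processors can beat the 10x ceiling
    # set by the remaining 10% serial part:
    print(round(amdahl_speedup(0.9, 1000), 1))  # 9.9
    ```

    The serial fraction, not the processor count, ends up dominating, which is why "just add a second chip" was never a free doubling.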
     
  5. Mar 7, 2013 #4
    OK, if we leave heat loss out of it, then doubling the transistors (and the processing power) in a single chip, versus splitting them across two chips, shouldn't make much difference in power consumption.
    Beyond that, you don't get the so-called "free lunch" just because you squeeze the whole "transistor kindergarten" under one roof.
    And I think that's pretty much what happened when mobile phones started coming out with shiny screens and many functions some years ago: they drained the battery far faster than the "stone age" first Nokias. I think the bright colourful screen and the larger processor were the main contributors.
     
  6. Mar 7, 2013 #5

    Vanadium 50

    User Avatar
    Staff Emeritus
    Science Advisor
    Education Advisor

    It does work. It also makes a chip twice as expensive when you double up that way.
     
  7. Mar 7, 2013 #6

    rcgldr

    User Avatar
    Homework Helper

    The voltage used for the smaller transistors can be reduced to ease the heat problem, but then the switching speed is also reduced. To maintain fast switching speeds, the ratio of voltage to transistor size needs to stay relatively high, but then heat becomes an issue, especially when the transistor density on the chip is high, which is why the clock limit on most air-cooled consumer processors has sat at about 4 GHz for almost a decade. Reducing transistor density (larger chips) gives more cooling surface per transistor, but that isn't cost-effective and is only used on special high-performance chips. The other alternative is a better cooling method, such as liquid cooling.
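
    The voltage/speed/heat trade-off above follows from the standard CMOS dynamic-power relation, P ≈ αCV²f. A rough Python sketch (the capacitance and activity-factor values are made-up placeholders, chosen only to show the scaling):

    ```python
    def dynamic_power(c_load_farads, v_dd, freq_hz, activity=0.2):
        """Approximate CMOS dynamic (switching) power: P = a * C * V^2 * f."""
        return activity * c_load_farads * v_dd**2 * freq_hz

    # Halving the supply voltage cuts switching power 4x at the same clock...
    p_high = dynamic_power(1e-9, 1.2, 4e9)
    p_low = dynamic_power(1e-9, 0.6, 4e9)
    print(round(p_high / p_low, 3))  # 4.0
    # ...but in practice the lower-voltage transistors then switch
    # more slowly, so you can't keep the 4 GHz clock for free.
    ```

    The quadratic dependence on voltage is exactly why designers lower V as far as switching speed allows, and why heat, not transistor count, became the clock-speed wall.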
     
  8. Mar 7, 2013 #7

    sophiecentaur

    User Avatar
    Science Advisor
    Gold Member

    Moore's law may become almost irrelevant with the introduction of cloud computing and (genuine) broadband data connections. Individual processors may only be needed to 'supervise' the machine/human interface.
    Moore's law doesn't limit its prediction to silicon technology, either. Quantum computing could be a serious gear change in speed (although the software would need to be re-jigged a bit to make it usable).
     
  9. Mar 7, 2013 #8

    nsaspook

    User Avatar
    Science Advisor

  10. Mar 7, 2013 #9
    I always thought the speed of light set an upper limit on clock speeds, regardless of temperature. That's why even liquid-nitrogen/helium-cooled processors still only reach about 7 GHz: beyond that, even huge voltages aren't enough to switch from 0 to 1 in time.

    Maybe that's all wrong?!
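
    The speed-of-light intuition is easy to check with a back-of-the-envelope calculation: how far can a signal travel in one clock period? A Python sketch:

    ```python
    C = 299_792_458.0  # speed of light in vacuum, m/s

    def distance_per_cycle_cm(clock_hz):
        """Hard upper bound on how far any signal can propagate
        in a single clock tick."""
        return C / clock_hz * 100.0

    # At 7 GHz, light itself covers only ~4.3 cm per cycle; on-chip
    # signals propagate considerably slower than c, so the paths a
    # signal must traverse within one cycle are shorter still.
    print(round(distance_per_cycle_cm(7e9), 2))  # 4.28
    ```

    So propagation delay does constrain chip geometry at high clocks, though in practice transistor switching time and heat hit first, as the earlier posts note.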
     
  11. Mar 7, 2013 #10

    nsaspook

    User Avatar
    Science Advisor

    We're at 8 GHz+ already: http://hothardware.com/News/AMD-Breaks-Frequency-Record-with-Upcoming-FX-Processor/
     
  12. Mar 7, 2013 #11

    rcgldr

    User Avatar
    Homework Helper

    Note that Moore's original law is about transistor count per chip, not transistor density or speed, and he only stated that it would last at least 10 years (from 1965 to 1975 and beyond). Since chip sizes haven't increased much in recent years, the law has evolved into a statement about transistor density.

    http://en.wikipedia.org/wiki/Moore's_law
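
    Moore's observation can be sketched numerically as a simple doubling every two years. A Python sketch (the 64-transistor 1965 baseline is taken here as a rough assumption, in the ballpark of the component counts Moore discussed):

    ```python
    def projected_transistors(year, base_year=1965, base_count=64,
                              doubling_years=2):
        """Transistor count per chip if it doubles every `doubling_years`."""
        return base_count * 2 ** ((year - base_year) / doubling_years)

    for y in (1965, 1975, 2013):
        print(y, f"{projected_transistors(y):,.0f}")
    # 1965 -> 64
    # 1975 -> 2,048
    # 2013 -> 1,073,741,824 (~10^9, roughly matching the
    #         billion-transistor chips of that era)
    ```

    The fact that a crude exponential from 1965 still lands near real 2013 transistor counts is what makes the "law" remarkable, even as the doubling mechanism shifted from bigger dies to denser ones.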
     
  13. Mar 8, 2013 #12
    What's wrong with this assumption is that you're looking only a year or two back. Maybe your laptop's processor would only be 2x larger, but if you look back 10 years, it would be 100x larger. Look back another 10 years and the processor alone would be 10,000x larger. Except it wouldn't, because it couldn't work.

    Also, we're talking about a laptop here. How about cell phones? How about the myriad other gadgets that just wouldn't be possible without miniaturization, and another myriad we can't even imagine today, which won't be possible without even further miniaturization?

    Of course, as others have mentioned, the more you can squeeze into each chip the more money you make. When printing these transistors, you are essentially printing money. That's what's driving companies to squeeze more and more into each chip. If they don't, somebody else will. I actually work for the company that makes these "money printers", so you could say my job is to keep Moore's law going!
     
  14. Mar 9, 2013 #13
    The spirit of Moore's law, I think, is that computational power per unit of mass or per unit of volume will increase exponentially.

    I don't think we're anywhere close to any sort of theoretical maximum yet. I base this on the fact that the human brain is still orders of magnitude better than any man-made computer of similar size in terms of what it can process. I also suspect the human brain is far from the theoretical maximum.
     
  15. Mar 9, 2013 #14

    sophiecentaur

    User Avatar
    Science Advisor
    Gold Member

    The human brain is a very 'approximate' calculating machine. It works so differently from any conventional man-made computer that you can't compare performances reliably. Some inspired human chess playing can still flummox even the best computers, when time is limited.
     
  16. Mar 11, 2013 #15

    cjl

    User Avatar

    Really? I thought that computerized chess programs were already to a level where no human can beat the best of them anymore.
     