
Biggest Change to Computer Chips In 40 Years

  1. May 9, 2007 #1

    Ivan Seeking

    Staff Emeritus
    Science Advisor
    Gold Member

    http://www.intel.com/technology/silicon/45nm_technology.htm
    http://www.intel.com/pressroom/archive/releases/20070128comp.htm

    According to one news report, they are talking about 1 teraflop CPUs in five years.
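    As a rough sanity check on a figure like that: peak throughput is usually estimated as cores × clock × floating-point operations per cycle per core. Here is a minimal sketch in C; the core count, clock, and per-cycle width below are illustrative assumptions, not anything from Intel's announcement.

        #include <stdio.h>

        int main(void) {
            /* All figures hypothetical, for illustration only. */
            double clock_hz = 4.0e9;   /* 4 GHz clock */
            int cores = 16;            /* assumed core count */
            int flops_per_cycle = 16;  /* e.g. wide SIMD units per core */

            double peak_flops = clock_hz * cores * flops_per_cycle;
            printf("Peak: %.3f TFLOPS\n", peak_flops / 1e12);  /* 1.024 TFLOPS */
            return 0;
        }

    In other words, reaching a teraflop on one CPU takes some combination of many cores and wide vector units, not just a higher clock.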
     
  3. May 9, 2007 #2
    Yeesh. My 2.4 GHz P4 already seems fast enough to do anything I need it to do.

    You know what I've noticed as computers have gotten faster? They've slowed down. My P4 running WinXP takes longer to boot than my 500 MHz AMD K6-2 running Windows 98, which takes about a minute and a half. For an even starker comparison, I once had the delight of booting a 66 MHz laptop with 16 MB of RAM running Win95. It took less than 20 seconds.

    It seems that as computers get faster and faster, programmers create more and more monstrous programs that are exponentially more wasteful with system resources.

    Don't get me wrong though, these technological breakthroughs are quite impressive.

    One thing I don't get, though: the PlayStation 3 is advertised at 2 TFLOPS. Why don't operating systems start supporting the same 128-bit processors as the gaming systems? I wouldn't mind having a 128-bit, modified PowerPC processor with 8 extra 3.2 GHz RISC cores running alongside the main 3.2 GHz 128-bit processor with extra L2 cache.
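    A side note on the "128-bit" part: on the Cell, as on desktop CPUs, 128 bits refers to the width of the SIMD vector registers (four 32-bit floats processed per instruction), not to 128-bit addressing, which is why a "128-bit OS" isn't really what's missing. A minimal C sketch of the idea, using x86 SSE intrinsics as a stand-in (the Cell's SPEs have their own, analogous 128-bit vector instruction set):

        #include <stdio.h>
        #include <xmmintrin.h>  /* SSE: 128-bit packed single-precision ops */

        int main(void) {
            float a[4] = {1.0f, 2.0f, 3.0f, 4.0f};
            float b[4] = {10.0f, 20.0f, 30.0f, 40.0f};
            float out[4];

            __m128 va = _mm_loadu_ps(a);      /* four floats in one 128-bit register */
            __m128 vb = _mm_loadu_ps(b);
            __m128 vsum = _mm_add_ps(va, vb); /* four additions, one instruction */
            _mm_storeu_ps(out, vsum);

            printf("%g %g %g %g\n", out[0], out[1], out[2], out[3]);
            return 0;
        }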
     
    Last edited: May 9, 2007
  4. May 9, 2007 #3

    -Job-

    Science Advisor

    In my opinion, while there is still room for improvement in speed and multitasking, gains in CPU speed will increasingly be spent on software reliability and security, as well as ease of programming.

    For example, in Java, at the cost of some performance (negligible on today's machines), we get automatic memory management and object-oriented conventions (which simplify software development), in addition to cross-platform compatibility.

    Features like these add overhead, but they let us build good software more quickly. So I think that as hardware gets faster, we'll see programming languages get simpler and their frameworks more reliable.
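    To make the memory-management point concrete by contrast, here is the manual bookkeeping C requires; the buffer and its contents are just illustrative. Java's garbage collector automates exactly this pairing of allocation and release, trading some CPU time for safety.

        #include <stdio.h>
        #include <stdlib.h>
        #include <string.h>

        int main(void) {
            char *buf = malloc(64);     /* manual allocation */
            if (buf == NULL) return 1;  /* caller must handle failure */

            strcpy(buf, "manually managed");
            printf("%s\n", buf);

            free(buf);  /* forget this: leak; do it twice: undefined behavior */
            return 0;
        }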
     
    Last edited: May 9, 2007
  5. May 11, 2007 #4
    I am excited by the breakthrough, but Moore's law is done. The sneaky move is to change Moore's law from talking about processors to talking about "chips", and then allow multiple processors on a single chip.
     
  6. May 11, 2007 #5
    I don't believe in Moore's law. It's more of a theory than anything.
     
  7. May 11, 2007 #6
    It's not even a theory; it's a rough approximation of a trend that is most notable for how long it lasted (exponential growth in the real world usually runs into problems more quickly). The whole point is that people kept expecting it to fail (i.e., no one believed it as a law) and were consistently surprised that it didn't.
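    For a sense of how long the trend held, here is the doubling arithmetic as a rough C sketch. The 1971 starting point (the Intel 4004, roughly 2,300 transistors) and the two-year doubling period are the usual textbook figures, not exact data.

        #include <stdio.h>
        #include <math.h>

        int main(void) {
            double start = 2300.0;  /* Intel 4004, 1971 */
            for (int year = 1971; year <= 2007; year += 6) {
                double count = start * pow(2.0, (year - 1971) / 2.0);
                printf("%d: ~%.0f transistors\n", year, count);
            }
            /* The 2007 line comes out near 6e8, in the ballpark of real
               2007-era quad-core chips -- 36 years of exponential growth. */
            return 0;
        }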
     