Biggest Change to Computer Chips In 40 Years

In summary: Intel is using a new transistor material that will allow it to continue delivering record-breaking PC, laptop and server processor speeds while reducing the electrical leakage from transistors that can hamper chip and PC design, size, power consumption, noise and cost. This transistor breakthrough also ensures that Moore's Law, the high-tech industry axiom that transistor counts double about every two years to deliver ever more functionality at exponentially decreasing cost, thrives well into the next decade. The discussion turns to Moore's Law itself: now that scaling is slowing down, some predict it will fail soon.
  • #1
Ivan Seeking
...More Performance for Exponentially Less Cost

In one of the biggest advancements in fundamental transistor design, Intel will use dramatically different transistor materials to build the hundreds of millions of microscopic 45 nanometer (nm) transistors inside the next generation of the company's Intel® Core™2 family of processors. Intel already has the world's first 45nm CPUs in-house - the first of at least fifteen 45nm processor products in development. This new transistor breakthrough will allow Intel to continue delivering record-breaking PC, laptop and server processor speeds while reducing the amount of electrical leakage from transistors that can hamper chip and PC design, size, power consumption, noise and costs. It also ensures that Moore's Law, a high-tech industry axiom that transistor counts double about every two years to deliver ever more functionality at exponentially decreasing cost, thrives well into the next decade.[continued]
http://www.intel.com/technology/silicon/45nm_technology.htm
http://www.intel.com/pressroom/archive/releases/20070128comp.htm

According to one news report, they are talking about 1 teraflop CPUs in five years.
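As a rough arithmetic illustration of the doubling rule quoted above, here is a minimal C sketch; the starting count and dates are assumptions chosen for illustration, not Intel's figures:

```c
#include <stdio.h>

/* Illustration of Moore's Law as quoted above: transistor counts
 * double roughly every two years. The starting count (~400 million,
 * of the order of a 2007-era quad-core die) is an assumed figure. */
int main(void)
{
    double count = 400e6;   /* assumed starting transistor count */
    int year = 2007;        /* assumed starting year */

    for (int i = 0; i <= 5; i++) {
        printf("%d: ~%.0f million transistors\n", year, count / 1e6);
        count *= 2.0;       /* one doubling per two-year step */
        year += 2;
    }
    return 0;
}
```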
 
  • #2
Yeesh. My 2.4 GHz P4 already seems fast enough to do anything I need it to do.

You know what I've noticed as computers have gotten faster? They've slowed down. My P4 with WinXP takes longer to boot than my 500 MHz AMD K6-2 running Windows 98, which takes about a minute and a half. And for an even starker comparison, I once had the delight of booting a 66 MHz laptop with 16 MB of RAM and Win95 on it. It took less than 20 seconds.

It seems that as computers get faster and faster, programmers create more and more monstrous programs that are exponentially more wasteful with system resources.

Don't get me wrong though, these technological breakthroughs are quite impressive.

One thing I don't get, though: the PlayStation 3 is advertised at 2 TFLOPS. Why don't operating systems start supporting the same 128-bit processors as the gaming consoles? I wouldn't mind having a 128-bit, modified PowerPC processor with eight extra 3.2 GHz RISC cores running alongside a main 3.2 GHz 128-bit processor with extra L2 cache.
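For what it's worth, the "128-bit" label on console CPUs refers to the width of their SIMD vector registers, not the general-purpose word size, and commodity x86 chips have exposed 128-bit vector registers since SSE. A minimal C sketch using SSE2 intrinsics (values are arbitrary):

```c
#include <stdio.h>
#include <emmintrin.h>   /* SSE2: 128-bit XMM vector operations */

/* Adds four floats in a single 128-bit SIMD instruction, the same
 * sense in which the Cell's vector units are "128-bit". */
int main(void)
{
    __m128 a = _mm_set_ps(4.0f, 3.0f, 2.0f, 1.0f);
    __m128 b = _mm_set_ps(40.0f, 30.0f, 20.0f, 10.0f);
    __m128 c = _mm_add_ps(a, b);   /* one add, four lanes at once */

    float out[4];
    _mm_storeu_ps(out, c);         /* copy the 128-bit result back */
    printf("%.0f %.0f %.0f %.0f\n", out[0], out[1], out[2], out[3]);
    return 0;
}
```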
 
  • #3
In my opinion, while there is still room for improvement in speed and multitasking, gains in CPU speed will increasingly be spent on software reliability and security, as well as ease of programming.

For example, in Java, at the cost of some performance (negligible on today's machines), we get automatic memory management and object-oriented structure (which simplify software development), in addition to cross-platform compatibility.

Features like these add overhead, but they let us build good software more quickly. So I think that as hardware gets faster, we'll see programming languages get simpler and their frameworks become more reliable.
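As a contrast with the automatic memory management mentioned above, here is a minimal C sketch of the manual bookkeeping that a garbage-collected language removes; the Record type and helper functions are made up for illustration:

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* In C, every allocation must be paired with an explicit free;
 * a missed free is a leak. Java's collector does this for you. */
typedef struct {
    char *name;
} Record;

Record *record_new(const char *name)
{
    Record *r = malloc(sizeof *r);
    if (!r) return NULL;
    r->name = malloc(strlen(name) + 1);   /* second allocation to track */
    if (!r->name) { free(r); return NULL; }
    strcpy(r->name, name);
    return r;
}

void record_free(Record *r)
{
    if (!r) return;
    free(r->name);   /* members must be freed before the struct */
    free(r);
}

int main(void)
{
    Record *r = record_new("example");
    if (r) {
        printf("%s\n", r->name);
        record_free(r);   /* forgetting this line leaks memory */
    }
    return 0;
}
```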
 
  • #4
I am excited by the breakthrough, but Moore's law is done. The sneaky move is to change Moore's law from talking about processors to talking about "chips", and then to allow multiple processor cores on a single chip.
 
  • #5
I don't believe in Moore's law. It's more of a theory than anything.
 
  • #6
I don't believe in Moore's law. It's more of a theory than anything.

It's not even a theory; it is a rough approximation to a trend that is most notable for how long it lasted (usually exponential growth in the real world runs into problems more quickly). The whole point is that people kept expecting it to fail (i.e., no one believed it as a law) and were consistently surprised that it didn't.
 

1. How will this change affect the speed and performance of computers?

The change is to the transistor itself. At the 45nm generation, Intel replaces the silicon dioxide gate dielectric that has been used for roughly 40 years with a hafnium-based high-k material, paired with new metal gate electrodes. These transistors switch faster and leak far less current, which lets Intel keep raising processor performance within the same power budget.

2. Will this change make computers more energy efficient?

Yes. Gate leakage, current that tunnels through the ever-thinner gate insulator and is wasted as heat, had become a major drain on chip power. The high-k dielectric cuts that leakage dramatically, so the chips waste less power even when idle, which translates into longer battery life and lower electricity consumption.
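A back-of-envelope sketch of the standard CMOS dynamic-power relation P = C·V²·f helps show why even a modest supply-voltage reduction saves a disproportionate amount of power; all the numbers below are illustrative assumptions, not measured chip data:

```c
#include <stdio.h>

/* Dynamic power P = C * V^2 * f: power falls with the square of
 * the supply voltage. Capacitance, frequency, and voltages here
 * are assumed values chosen only to make the arithmetic concrete. */
int main(void)
{
    double cap   = 1.0e-9;           /* switched capacitance, farads */
    double freq  = 3.0e9;            /* clock frequency, hertz */
    double v_old = 1.3, v_new = 1.1; /* supply voltages, volts */

    double p_old = cap * v_old * v_old * freq;
    double p_new = cap * v_new * v_new * freq;

    printf("P at %.1f V: %.2f W\n", v_old, p_old);
    printf("P at %.1f V: %.2f W (%.0f%% of before)\n",
           v_new, p_new, 100.0 * p_new / p_old);
    return 0;
}
```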

3. How will this change impact the size and form factor of computers?

The 45nm transistors are considerably smaller than the previous 65nm generation, so roughly twice as many fit in the same die area (see the arithmetic sketch below). Designers can spend that density on smaller, cooler chips, which helps enable thinner and lighter laptops and more capable compact devices.
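Here is a rough arithmetic sketch of the idealized node-shrink math; real design rules are messier, so treat the factor of two as an approximation:

```c
#include <stdio.h>

/* Area scales with the square of the linear feature size, so a
 * 65nm -> 45nm shrink roughly doubles transistor density. */
int main(void)
{
    double old_node = 65.0, new_node = 45.0;  /* nanometers */
    double linear = new_node / old_node;      /* ~0.69 linear shrink */
    double area   = linear * linear;          /* ~0.48 of old area */

    printf("linear shrink: %.2f, area ratio: %.2f, density gain: %.1fx\n",
           linear, area, 1.0 / area);
    return 0;
}
```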

4. What are the potential drawbacks of this change?

The main drawbacks are on the manufacturing side. Introducing new gate materials into a high-volume fabrication process is complex and expensive, and yields on a brand-new process node are typically lower at first. The chips themselves remain ordinary x86 processors, so existing software and hardware do not need to change.

5. When can we expect to see devices using these new chips on the market?

The transition is already underway: Intel reported working 45nm processors in early 2007 and said the first products, the Penryn family of Core 2 processors, would ship in the second half of 2007. As with any new process node, it will take a year or two for the technology to spread across the full product line.
