
More than 2GB of VRAM for 1080p gaming?

by Kutt
Tags: 1080p, gaming, vram
wuliheron
#19
Nov18-12, 03:01 AM
P: 1,967
Quote Quote by Kutt View Post
Are there ways to dramatically increase the BUS speeds in motherboards?
There are a number of ways, but the semiconductor industry is very brute-force oriented. Companies like Intel might have advanced technology they sit on for years while they figure out the cheapest way to implement it. That's just the way it goes when you're talking about companies with multiple factories costing billions of dollars each just to build. Money is what drives progress, because you can easily lose a fortune if you don't keep your eye on the bottom line.

The general trend has been to eliminate the mobo altogether whenever possible. The entire history of integrated circuits is about eliminating the mobo whenever possible and just putting everything on the chip. As a result we now have cellphones with the power of what would have been a supercomputer 20 years ago. The latest trend is chip stacking, where you can place chips side by side on the same piece of silicon and/or stack them right on top of each other.
Kutt
#20
Nov18-12, 02:27 PM
P: 236
Quote Quote by wuliheron View Post
There are a number of ways, but the semiconductor industry is very brute-force oriented. Companies like Intel might have advanced technology they sit on for years while they figure out the cheapest way to implement it. That's just the way it goes when you're talking about companies with multiple factories costing billions of dollars each just to build. Money is what drives progress, because you can easily lose a fortune if you don't keep your eye on the bottom line.

The general trend has been to eliminate the mobo altogether whenever possible. The entire history of integrated circuits is about eliminating the mobo whenever possible and just putting everything on the chip. As a result we now have cellphones with the power of what would have been a supercomputer 20 years ago. The latest trend is chip stacking, where you can place chips side by side on the same piece of silicon and/or stack them right on top of each other.
With the physical limit of Moore's law fast approaching, what will chip makers like Intel and AMD do to further increase the performance and efficiency of their products without reducing the size of the transistors? 22nm is pushing it as far as Moore's law goes.

I'd like to see what AMD/Nvidia will do with 22nm GPUs for the time being. The next-gen HD 8xxx and GTX 7xx series are still going to be 32nm from what I've read.
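Just to put the "doubling" in Moore's law into numbers, here's a rough back-of-the-envelope sketch in Python. The 2-year doubling period and the starting transistor count are illustrative assumptions, not figures from this thread:

```python
# Moore's law, idealized: transistor counts double roughly every 2 years.
# Both the doubling period and the starting count are illustrative guesses.
def projected_transistors(start_count, years, doubling_period=2.0):
    """Project a transistor count `years` from now under ideal doubling."""
    return start_count * 2 ** (years / doubling_period)

# Starting from ~1.4 billion transistors (ballpark for a 2012 desktop CPU),
# ten years of ideal scaling would mean 2**5 = 32x as many:
print(projected_transistors(1.4e9, 10) / 1.4e9)  # prints 32.0
```

Of course the whole question is whether that doubling can continue once feature sizes approach atomic scales.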
wuliheron
#21
Nov18-12, 03:07 PM
P: 1,967
Quote Quote by Kutt View Post
With the physical limit of Moore's law fast approaching, what will chip makers like Intel and AMD do to further increase the performance and efficiency of their products without reducing the size of the transistors? 22nm is pushing it as far as Moore's law goes.

I'd like to see what AMD/Nvidia will do with 22nm GPUs for the time being. The next-gen HD 8xxx and GTX 7xx series are still going to be 32nm from what I've read.
Around 1nm is the molecular scale, and there are carbon nanotubes that function extremely well at those sizes; however, Moore's law has already been supplanted by Koomey's law. Devices are getting so small that energy efficiency and reducing waste heat have become the bigger issues. With advances like chip stacking and 3D circuitry you could theoretically stack a hundred chips right on top of each other and fit the equivalent of a basketball-court-sized supercomputer in a walnut, but only if you can overcome the physical limitations such as waste heat. Spintronics and quantum computing offer the best energy efficiency known to date, with quantum computers theoretically even capable of absorbing their own excess heat and using it to power themselves, and of being compatible with spintronics.

Another trend is towards reconfigurable computing. Instead of shrinking parts or packing more chips closer together, these designs use fewer parts to do the same amount of work. For example, IBM's neuromorphic 4nm memristor chip can literally turn a transistor into memory and vice versa on the fly as needed. Their ten-year goal is to put nothing less than the equivalent of the neurons of a cat or human brain on a single chip, which gives you some idea of how compact such circuitry can be.
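For what it's worth, Koomey's observation can be sketched the same way Moore's law usually is. The ~1.57-year doubling period is the figure commonly cited for Koomey's law; the base efficiency of 1.0 here is just an arbitrary unit:

```python
# Koomey's law: computations per joule have historically doubled
# roughly every 1.57 years. The doubling period below is the commonly
# cited historical figure; the base efficiency of 1.0 is an arbitrary unit.
def projected_efficiency(base_ops_per_joule, years, doubling_period=1.57):
    """Project energy efficiency `years` out under Koomey-style doubling."""
    return base_ops_per_joule * 2 ** (years / doubling_period)

# A decade compounds to roughly an 80x gain in computations per joule:
print(projected_efficiency(1.0, 10))  # ~82.7
```

That compounding is why a fixed power budget (a phone battery, a datacenter rack) keeps buying more computation even when clock speeds stall.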
Kutt
#22
Nov18-12, 09:27 PM
P: 236
Quote Quote by wuliheron View Post
Around 1nm is the molecular scale, and there are carbon nanotubes that function extremely well at those sizes; however, Moore's law has already been supplanted by Koomey's law. Devices are getting so small that energy efficiency and reducing waste heat have become the bigger issues. With advances like chip stacking and 3D circuitry you could theoretically stack a hundred chips right on top of each other and fit the equivalent of a basketball-court-sized supercomputer in a walnut, but only if you can overcome the physical limitations such as waste heat. Spintronics and quantum computing offer the best energy efficiency known to date, with quantum computers theoretically even capable of absorbing their own excess heat and using it to power themselves, and of being compatible with spintronics.

Another trend is towards reconfigurable computing. Instead of shrinking parts or packing more chips closer together, these designs use fewer parts to do the same amount of work. For example, IBM's neuromorphic 4nm memristor chip can literally turn a transistor into memory and vice versa on the fly as needed. Their ten-year goal is to put nothing less than the equivalent of the neurons of a cat or human brain on a single chip, which gives you some idea of how compact such circuitry can be.
These chips sound incomprehensibly complex, and even more difficult to write programs for. The human brain is the most complex known object in the universe; I can't imagine a computer chip exceeding that kind of complexity.

I remember reading something about "optical transistors," which use tiny bursts of light (photons) to switch transistors on and off instead of electrons. Theoretically, such chips could run at terahertz speeds and produce very little heat.
wuliheron
#23
Nov19-12, 12:32 AM
P: 1,967
Quote Quote by Kutt View Post
These chips sound incomprehensibly complex, and even more difficult to write programs for. The human brain is the most complex known object in the universe; I can't imagine a computer chip exceeding that kind of complexity.

I remember reading something about "optical transistors," which use tiny bursts of light (photons) to switch transistors on and off instead of electrons. Theoretically, such chips could run at terahertz speeds and produce very little heat.
"A foolish consistency is the hobgoblin of little minds." -- R. W. Emerson

The complexity of modern technology would have been considered unthinkable just a few centuries ago. Already there are attempts to build the first super-von-Neumann architectures, alongside the neuromorphic IBM chip I mentioned. These are disruptive technologies with potentials nobody can predict. A full-scale quantum computer of 128 qubits could shake the very foundations of the sciences themselves. All I can say is be prepared to be amazed, because this roller coaster ride only gets faster and more interesting from this point on.
AlephZero
#24
Nov19-12, 08:59 AM
Engineering
Sci Advisor
HW Helper
Thanks
P: 7,121
Quote Quote by wuliheron View Post
These are disruptive technologies with potentials nobody can predict. A full scale quantum computer of 128 qubits could shake the very foundations of the sciences themselves. All I can say is be prepared to be amazed because this roller coaster ride only gets faster and more interesting from this point on.
Don't worry about what to do with all that computing power. Somebody will invent an even more bloated user interface that needs 128 qubits just to display a 3D holographic desktop ...
wuliheron
#25
Nov19-12, 10:06 AM
P: 1,967
Quote Quote by AlephZero View Post
Don't worry about what to do with all that computing power. Somebody will invent an even more bloated user interface that needs 128 qubits just to display a 3D holographic desktop ...
LOL, it's the drivers I'm worried about.
Kutt
#26
Nov19-12, 03:01 PM
P: 236
Quote Quote by wuliheron View Post
LOL, it's the drivers I'm worried about.
Yeah, as if drivers weren't already frustratingly hard enough to program.
mishrashubham
#27
Nov21-12, 07:42 AM
P: 605
Quote Quote by AlephZero View Post
Don't worry about what to do with all that computing power. Somebody will invent an even more bloated user interface that needs 128 qbits just to display a 3D holographic desktop ...
Compiz 4D lol

