Next Generation Computer Hardware Coming Soon

AI Thread Summary
The development of a new generation of computers is rapidly advancing, characterized by the emergence of "Frankenstein chips" that integrate multiple components, stacked or arranged closely together, to reduce power consumption and latency. Current technologies, such as stacked chips in servers and smartphones, demonstrate the feasibility of this approach, and Intel's hybrid memory cubes, which offer high transfer speeds in a fraction of the space, are paving the way for replacing traditional RAM with more efficient memory solutions. The coming Frankenstein chips will feature heterogeneous architectures that combine several types of processors to accelerate tasks like AI and graphics, and hardware-accelerated transactional memory should make them easier to program. The growing demand for bandwidth, driven by ultra-high-definition displays, is shaping the design of future computers, in which the lines between individual components blur as they work together to maximize overall performance. Despite the original predictions, OLED technology has not come to dominate the market, leaving open the question of what the next advances in computer hardware will be.
wuliheron
All the pieces are falling into place rapidly for the development of a new generation of computers that only vaguely resemble the ones we have today. The single biggest innovation to look forward to in next-generation computer hardware is what I like to call Frankenstein chips. Just as shrinking the components on a chip reduces latencies, you can stack one chip right on top of another and/or place them side by side on a silicon "interposer" to create one massive chip sewn together from parts, like Frankenstein's monster. Because the parts sit closer together, both power requirements and latencies drop. With each chip millimeters thick and the size of a fingernail, the sky is the limit if you can deal with heat and other technical issues economically. The most frustrating thing for me about this technology is that it is completely unpredictable: the expensive Frankenstein chip you buy today could become obsolete overnight.

Stacking may sound like pie in the sky to some, but servers and smartphones already use stacked chips, HP has offered to put 2 GB of its memristors on top of any existing chip, and stacks of up to eight conventional RAM chips can be made. The only remaining issue is cost. Last year Intel demonstrated its new hybrid memory cubes with 1 Tb/s transfer speeds, and pictures of its upcoming Haswell chip indicate it was designed to use an interposer. A consortium of all the major manufacturers has already formed to establish a standard for hybrid memory cubes so they can replace traditional RAM sticks as soon as possible. Using 70% less space and a seventh of the power, they are ideal for portable applications. Because each cube contains its own controller chip for input and output, the memory chips themselves can eventually be replaced with anything, including nonvolatile phase-change memory.
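A quick back-of-envelope check of those figures: the 70% space savings and sevenfold power reduction are the numbers quoted above, while the 4 W baseline for a conventional RAM module is purely an illustrative assumption.

```python
# Back-of-envelope comparison of a conventional RAM module vs. a
# hybrid memory cube, using the figures quoted in the post.
# The 4 W baseline for a conventional DIMM is an illustrative assumption.
dimm_power_w = 4.0
dimm_area = 1.0                          # normalized board area

hmc_power_w = dimm_power_w / 7           # "sevenfold decrease in power"
hmc_area = dimm_area * (1 - 0.70)        # "70% less space"

print(f"HMC power: {hmc_power_w:.2f} W") # ~0.57 W
print(f"HMC area:  {hmc_area:.2f}x")     # 0.30x the board area
```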

All the evidence indicates we're about to get slammed with a variety of Frankenstein chips and, as if that were not confusing enough, some of the individual chips will have heterogeneous architectures containing several different types of processors. As best I can determine, eight CPU cores is an ideal minimum for processing full-blown matrices, while roughly 80-300 simplified GPU cores are ideal for anything from transcoding to physics to AI. For a desktop PC, that means more of the load normally placed on the video card can be handled by the APU and transferred over the currently underutilized PCIe 3.0 bus. Exactly how these heterogeneous architectures will evolve is anyone's guess but, thankfully, they will incorporate hardware-accelerated transactional memory, making them easier to program.
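The programming model behind transactional memory can be sketched in software: a transaction speculatively buffers its reads and writes, then commits them atomically, which is the same begin/commit pattern hardware implementations expose. This is a minimal, simplified Python sketch; the `Transaction` class and `transfer` helper are illustrative names, not any real TM API.

```python
import threading

_commit_lock = threading.Lock()   # stand-in for the hardware's atomic commit

class Transaction:
    """Minimal optimistic-transaction sketch: buffer writes, commit atomically."""
    def __init__(self, store):
        self.store = store
        self.writes = {}          # private write buffer, invisible until commit

    def read(self, key):
        # A transaction sees its own pending writes first.
        return self.writes.get(key, self.store.get(key))

    def write(self, key, value):
        self.writes[key] = value

    def commit(self):
        with _commit_lock:        # publish the whole write set atomically
            self.store.update(self.writes)

def transfer(store, src, dst, amount):
    # The programmer writes straight-line code; atomicity comes from the commit.
    tx = Transaction(store)
    tx.write(src, tx.read(src) - amount)
    tx.write(dst, tx.read(dst) + amount)
    tx.commit()

accounts = {"a": 100, "b": 0}
transfer(accounts, "a", "b", 30)
print(accounts)                   # {'a': 70, 'b': 30}
```

The appeal is exactly what the post suggests: the programmer writes plain sequential code and the transaction machinery, whether software or hardware, takes care of making it appear atomic to other threads.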

The only way I know to evaluate the power of such monstrosities is by measuring raw bandwidth capacity and, apparently, a lot of that bandwidth will soon be taken up by ultra-high-definition screens. OLEDs continue to trickle onto the market, but LCD manufacturers now have a way to produce ultra-high-definition screens almost as cheaply as today's high-definition LCDs and are retooling their assembly lines as quickly as possible. To leverage the available bandwidth even better, the first video cards capable of using system RAM as well as VRAM for displaying graphics are already on the market, perhaps indicating the shape of things to come: computers where the distinctions between the individual components become increasingly blurred, as the emphasis shifts to maximizing overall bandwidth by designing every component to be flexible enough to assist the others in almost any task.
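To put the display-bandwidth claim in numbers, here is a naive uncompressed-pixel estimate for an ultra-high-definition ("4K") panel. The 24 bits-per-pixel and 60 Hz figures are standard assumptions, not from the post.

```python
# Uncompressed video bandwidth for a 3840x2160 panel at 60 Hz, 24 bpp.
width, height = 3840, 2160
bits_per_pixel = 24          # 8 bits each for R, G, B
refresh_hz = 60

bits_per_second = width * height * bits_per_pixel * refresh_hz
print(f"{bits_per_second / 1e9:.1f} Gb/s")  # ~11.9 Gb/s uncompressed
```

Roughly 12 Gb/s for the display alone, before any compression, which gives a sense of why panel resolution drives so much of the bandwidth budget.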
 
Seven years later, what is the state of these predictions? OLED certainly has not come to dominate the market. What is next?
 