What is the likely limit of processing speed?

AI Thread Summary
The discussion focuses on the potential limits of processor speed, emphasizing that physical constraints may hinder advancements beyond current technologies. Key points include the significance of data connection bandwidth within processors and the limitations imposed by signal propagation speed. The conversation highlights that modern processors have shifted towards multicore architectures to enhance performance, as increasing clock rates face challenges related to heat dissipation. Additionally, there are considerations regarding the future of computing technologies, such as quantum computing and biocomputing, which may offer alternative pathways. Ultimately, the consensus suggests that while there are existing limits, breakthroughs in technology could redefine these boundaries in the coming decades.
dragoneyes001
The other thread about computer sciences got me wondering: how fast, or how many millions or billions of operations per second, is the likely physical limit for a processor? I'm not asking how small components will get, but whether there is a brick wall for processing itself. To narrow the scope, this is about the limit of a single processor, not processors working in tandem.

Does anyone know if there is a limit?
 
If you are discounting parallel processing and also any limit imposed by how small components can be made,
then I guess the next possible limit to confront would be data connection bandwidth.
By this I mean how many bits of information can be exchanged between components in one operation.
At the moment we can get what we consider to be quite good bandwidth using optical fiber and visible light frequencies.
In principle I suppose the possible bandwidth of data connections can be increased by using UV or even X-rays and gamma rays,
but I think there would be a limit beyond which materials cannot be pushed, or be expected to operate in the presence of ionizing radiation.
 
dragoneyes001 said:
I'm not asking how small components will get, but whether there is a brick wall for processing itself.

Well, I think that is the brick wall. What are we down to now? 22 nm technology? I haven't followed the progress recently, as in the last few years, but the "chatter" around the industry at that time was that there may be physical limits to getting much smaller than that. This is the big reason we've seen the advent of multicore CPU technology over the past decade, as well as what were traditionally the shaders in video cards being converted into parallel processing units, such as NVIDIA's CUDA architecture. Add to that the push to incorporate more cache RAM onto the CPU itself. This is also why you are seeing more and more articles in the literature on "biocomputing" and nanoarchitectures. But these are as yet unproven alternatives to the core silicon architecture.

rootone said:
At the moment we can get what we consider to be quite good bandwidth using optical fiber and visible light frequencies.
In principle I suppose the possible bandwidth of data connections can be increased by using UV or even X-rays and gamma rays,
but I think there would be a limit beyond which materials cannot be pushed, or be expected to operate in the presence of ionizing radiation.

The OP is talking about processors, not processes. So fiber optic bandwidth is not a consideration here. Hence my comment about the push to put more cache on the CPU itself, not to mention the push to put the graphics processing capacity on the CPU as well.
 
So far nobody has mentioned the fact that we are limited by signal propagation speed. If the memory is 1 foot from the processor, a signal needs about 1 ns just to cover that distance. Sure, inside the processor the distances are several orders of magnitude smaller, but it still means there is a physical limit.
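As a rough illustration of that point (a sketch with assumed numbers, not from the thread): light in vacuum covers about one foot per nanosecond, and real signals on copper traces are slower still, so distance alone puts a floor under latency no matter how fast the logic switches.

```python
# Lower bound on one-way signal latency from distance alone,
# ignoring all switching and buffering delays (illustrative sketch).
C = 2.99792458e8  # speed of light in vacuum, m/s

def min_delay_ns(distance_m, fraction_of_c=1.0):
    """One-way propagation delay in nanoseconds at the given signal speed."""
    return distance_m / (C * fraction_of_c) * 1e9

print(min_delay_ns(0.3048))        # 1 foot at c: ~1.02 ns
print(min_delay_ns(0.3048, 0.6))   # at 0.6c, a typical speed on copper traces: ~1.69 ns
print(min_delay_ns(0.02, 0.6))     # 2 cm across a die/package: ~0.11 ns
```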
 
Borek said:
So far nobody has mentioned the fact that we are limited by signal propagation speed. If the memory is 1 foot from the processor, a signal needs about 1 ns just to cover that distance. Sure, inside the processor the distances are several orders of magnitude smaller, but it still means there is a physical limit.
That's sort of what I was getting at when mentioning bandwidth - not bandwidth in the internet-connection-speed sense, but bandwidth in terms of how much data, and how quickly, can be shoved around within the internals of the processor.
I guess I mean the maximum data transmission capacity of what is commonly called the bus subsystem (mostly 64 bits for a regular PC at the moment).
I imagine that once you try to get that above, say, 1k bits, you will start to see diminishing returns from adding more bits.
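To put numbers on that (a back-of-the-envelope sketch; the widths and transfer rates below are assumptions for illustration, not figures from the thread), peak bus throughput is just width times transfer rate, so the ideal scaling is linear - the diminishing returns come from practical problems like routing and skew across very wide buses:

```python
# Peak throughput of a parallel bus: width (bits) x transfer rate.
# Numbers are illustrative assumptions, not real hardware specs.
def peak_throughput_gbit_s(width_bits, transfers_per_second):
    return width_bits * transfers_per_second / 1e9

print(peak_throughput_gbit_s(64, 2.0e9))    # 64-bit bus at 2 GT/s: 128 Gbit/s
print(peak_throughput_gbit_s(1024, 2.0e9))  # 1024-bit bus: 2048 Gbit/s, if it could be routed and kept in sync
```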
 
I think you can only set a limit for the current technology. You would have to rule out breakthroughs from quantum computers, superconductors, optical computing, etc. That is a lot to rule out over the next 30-40 years.
 
For a relatively low-cost consumer processor, the limit on clock rate with air cooling is around 4 GHz, and it's been that way for almost a decade. The issue is heat dissipation in high-density chips. Instead of increasing clock rates, modern consumer processors have more cores (4 or 6) and perform more operations in parallel. Liquid cooling raises the limit, but again there's a ceiling: there are liquid-nitrogen-cooled chips running at 6.5 GHz and water-cooled chips running around 5 GHz. Density is an issue even with liquid cooling, because there's not much heat-conducting material between the components of a processor chip.
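The reason raising the clock is so costly in heat can be seen from the standard first-order CMOS dynamic power model, P ≈ C·V²·f: pushing frequency up usually also requires raising voltage, so power grows much faster than the clock. A minimal sketch (the capacitance and voltage figures are made-up illustrations, not measurements from any real chip):

```python
# First-order CMOS dynamic power: P ~ C_eff * V^2 * f.
# All numbers below are assumed for illustration only.
def dynamic_power_w(c_eff_farads, v_volts, f_hz):
    return c_eff_farads * v_volts**2 * f_hz

base = dynamic_power_w(1e-9, 1.2, 4.0e9)  # ~5.8 W at 4 GHz and 1.2 V
fast = dynamic_power_w(1e-9, 1.4, 6.5e9)  # ~12.7 W at 6.5 GHz and 1.4 V
print(base, fast, fast / base)            # clock up 1.6x, heat up ~2.2x
```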
 
Here is a unique view from Seth Lloyd, a quantum mechanic at MIT:

https://edge.org/conversation/the-computational-universe

He discusses it in more detail here:
http://arxiv.org/abs/quant-ph/9908043

In summary:
The 'ultimate laptop' is a computer with a mass of 1 kg and a volume of 1 l, operating at the fundamental limits of speed and memory capacity fixed by physics. The ultimate laptop performs 2mc^2/(πℏ) = 5.4258 × 10^50 logical operations per second on ≈ 10^31 bits. Although its computational machinery is in fact in a highly specified physical state with zero entropy, while it performs a computation that uses all its resources of energy and memory space it appears to an outside observer to be in a thermal state at ≈ 10^9 degrees Kelvin. The ultimate laptop looks like a small piece of the Big Bang.
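As a quick check on that figure, here is a minimal Python sketch plugging m = 1 kg into the 2mc^2/(πℏ) bound quoted above:

```python
# Verify Lloyd's "ultimate laptop" operation rate: 2*m*c^2 / (pi * hbar).
import math

m = 1.0                  # mass in kg
c = 2.99792458e8         # speed of light, m/s
hbar = 1.054571817e-34   # reduced Planck constant, J*s

ops_per_second = 2 * m * c**2 / (math.pi * hbar)
print(f"{ops_per_second:.4e} ops/s")  # ~5.4256e+50, matching Lloyd's 5.4258e50 up to rounding of the constants
```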
 
How is heat a potential issue in quantum computers? Since atoms become more active at higher temperatures, would that actually be a limiting factor?
 
Heat is random motion and random state changes. A quantum computer depends on the states of its quantum bits being controlled by the logic of the problem you are trying to solve, so throwing in random behavior can ruin the whole thing.
 