Computers, computation and heat

In summary, computers have made more progress in the last ten years than cars did in the last hundred, but Blue Gene still falls short of the speed attributed to three pounds of goo in a person's head. The difference in power consumption comes down to the heat a computer generates.
  • #1
Tiiba
There is a computer forum and a brain forum. I don't know which to choose. But I'll go here.

Computers made more progress in the last ten years than cars did in the last hundred. But Blue Gene still falls short of the speed attributed to three pounds of goo in my head (20 quadrillion ops per second, according to Ray Kurzweil). As I understand it, the reason is heat - if all the computing power of Blue Gene were packed into three pounds, it would melt. But the brain does all its work on 20 watts - about one watt per petaflop.
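The figures above can be sanity-checked with a little arithmetic. A minimal sketch, taking Kurzweil's estimate at face value (these are the thread's assumed numbers, not measurements):

```python
# Back-of-the-envelope check of the quoted figures: ~20 quadrillion
# ops per second on a ~20 W metabolic budget.

BRAIN_POWER_W = 20.0       # rough power consumption of the brain
BRAIN_OPS_PER_S = 20e15    # 20 quadrillion "ops"/s (Kurzweil's estimate)

joules_per_op = BRAIN_POWER_W / BRAIN_OPS_PER_S
watts_per_petaflop = BRAIN_POWER_W / (BRAIN_OPS_PER_S / 1e15)

print(f"{joules_per_op:.1e} J per op")            # ~1e-15 J (a femtojoule)
print(f"{watts_per_petaflop:.1f} W per petaflop")  # 1.0
```

So on these assumptions the brain spends about a femtojoule per "operation", or one watt per petaflop of throughput.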

Why is there such a big difference in power consumption? Lots of people have imitated neural networks, but have there been any attempts to build an artificial neural network that uses energy like the real one? An attempt to find out got me to a discussion of why neurons use MORE energy than they should...
 
  • #2
You are comparing apples, oranges, and bananas. A computer is not a neural net and neither of these are brains. The human brain has an overall architecture that is much more complex than computers or mere neural nets. Until we truly understand this architecture it is impossible to say with certainty whether or not an electronic device will ever be able to compete with the human brain for size, power, etc.

However, heat is probably not going to be a big factor in the long run. For example, quantum neural nets are projected to have capabilities, including speeds, that classical devices cannot attain, and such devices thus far tend to work best close to absolute zero.

Some people such as Roger Penrose believe the human brain may actually work on the quantum level as well, but thus far there is no evidence to support this theory. In any case, it may be that we can never create a classical device that competes with the brain for power, size, etc., but it is almost certain that quantum devices will possess capabilities that the human brain cannot duplicate. A quantum computer's state space grows exponentially - each added qubit doubles it - so for some problems it takes longer to input the question than it does to receive the answer.
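To make the growth claim concrete: describing the state of n qubits classically takes 2**n complex amplitudes, so each added qubit doubles the bookkeeping. A quick sketch (the 16-byte complex size is just the usual double-precision assumption):

```python
# Why adding qubits multiplies a quantum computer's state space:
# n qubits span 2**n basis states, so a classical simulator has to
# track 2**n complex amplitudes.

for n in (1, 2, 10, 50):
    amplitudes = 2 ** n
    print(f"{n:>2} qubits -> {amplitudes} complex amplitudes")

# Storing 50 qubits' amplitudes as 16-byte complex numbers:
bytes_needed = (2 ** 50) * 16
print(f"~{bytes_needed / 1e15:.0f} PB for 50 qubits")  # ~18 PB
```

Exponential, not factorial - but the practical point stands: even modest qubit counts outrun any classical memory.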
 
  • #3
"A computer is not a neural net"

Let me use this opportunity to ask: can you believe that there is actually a living, breathing human being named Rajagopal Ananthanarayanan?

Not entirely relevant. But, on the other hand, Neurogrid is both a computer and a neural network.

"and neither of these are brains"

What do you mean?

"However, heat is probably not going to be a big factor in the long run."

I just want to know how they (or I) do it. Why does a natural neural network need more energy than its artificial implementations, including, as I understand it, hardware implementations, and what are people doing about it? Chips tend to get more efficient as their components get smaller, but neurons are much bigger than 65 nm.

"In any case, it may be that we can never create a classical device that competes with the brain for power, size, etc"

I once had an opportunity to argue about this. The debate made me more certain than when I started out (the usual effect) that, if humans aren't destroyed first, they will certainly be able to make machines smaller, faster, and smarter than brains - probably much more so.
 
  • #4
Computers work using logic gates and Boolean logic, not neural nets. It is like comparing a phonograph to a CD player. Of course, the converse can be true - a neural net can be a computer - but the two are not synonymous. For example, a human being is not a computer and vice versa.

Although there is a tendency among some people to equate animal brains with computers and minds with computer programs, there are critical distinctions. Thus far, computers can only imitate emotions and sentience.

Neurons only account for roughly half of the human brain. The other half is composed of axions whose function is poorly understood. In addition, the animal brain is integral to the body and shares in its biochemistry and genetic makeup. It is powered by chemical reactions rather than electricity.

As for hardware getting more efficient as it gets smaller, that has only been true for the last forty years. Animal brains have been around for a great deal longer than that.
 
  • #5
"The other half is composed of axions whose function is poorly understood."

I know about another half called glial cells, but axions?
 
  • #6
The biggest reason is that the brain is primarily a chemical computer, whereas man-made computers use electricity. Yes, nerve conduction can be said to use "electrical signals," but action potentials are created by ion transport, not by actual currents of electrons as in wires. The nervous system is essentially entirely chemical.
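The "ion transport, not electron currents" point can be put in numbers. A hedged illustration: membrane potentials arise from ion concentration gradients via the Nernst equation. The K+ concentrations below are typical textbook values, not figures from this thread:

```python
import math

# Nernst potential: the voltage at which an ion's concentration
# gradient and the electrical gradient across the membrane balance.
R = 8.314        # gas constant, J/(mol*K)
T = 310.0        # body temperature, K
F = 96485.0      # Faraday constant, C/mol
z = 1            # charge of K+

K_out_mM = 5.0   # extracellular K+ (typical textbook value)
K_in_mM = 140.0  # intracellular K+ (typical textbook value)

E_K = (R * T) / (z * F) * math.log(K_out_mM / K_in_mM)
print(f"Nernst potential for K+: {E_K * 1000:.0f} mV")  # about -89 mV
```

That tens-of-millivolts scale, maintained by pumping ions rather than by driving electron currents, is the "chemical computer" at work.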

Consider the energy used by a synapse: a connection between two neurons. To stimulate its neighbor, a neuron only needs to secrete a small number of neurotransmitter molecules into the synapse. The energy consumed by the active transport system in pushing around this tiny quantity of chemical is minuscule. The neuron can change state simply by pulling these molecules back inside itself, also at a very small cost in energy. A small quantity of neurotransmitter can simply be recycled, over and over, by the neuron.

In comparison, whenever modern transistors switch states, they conduct current from their power supplies. The faster you want them to transition between states, the more current they must conduct. When one logic gate signals its neighbor, it must dump charge from the power supply into the input capacitance of its neighbor. When it later changes state, that charge is not recycled; it is passed on to the negative power supply and dissipated as heat. Every time the gate changes state, another packet of charge is lost. This is, of course, a stark contrast to the brain, which uses its neurotransmitters over and over, and spends only tiny amounts of energy moving them around.
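The charge-dumping cost described above is the standard dynamic-power model for CMOS logic: each full charge/discharge cycle of a gate's load capacitance C at supply voltage V draws C*V**2 from the supply. A minimal sketch - the capacitance, voltage, clock, gate count, and activity factor below are all illustrative assumptions, not specs of any real chip:

```python
# Dynamic power of switching logic: P = alpha * N * C * V**2 * f

C = 1e-15        # ~1 fF effective load per gate (assumed)
V = 1.0          # 1 V supply (assumed)
f = 3e9          # 3 GHz clock (assumed)
alpha = 0.1      # fraction of gates switching each cycle (assumed)
gates = 1e9      # a billion gates (assumed)

energy_per_cycle = C * V ** 2              # J drawn per full switch cycle
dynamic_power = alpha * gates * C * V ** 2 * f

print(f"{energy_per_cycle:.1e} J per gate switch cycle")   # 1e-15 J
print(f"~{dynamic_power:.0f} W dynamic power")             # ~300 W
```

Femtojoules per switch, times billions of gates, times billions of cycles per second: that is where the hundreds of watts - and the heat - come from.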

So, why don't we develop chemical computers? We currently lack the technological sophistication to engineer small-scale, reliable chemical computers that mimic the brain's signaling. If or when we do, we might give up on electrical computers altogether. It'll be a long time, though!

Another problem with our current computers is that they just aren't as specialized or as efficient at solving problems as our brains are. Our brains can do more with several thousand neurons than we can do with a Pentium chip. This is not because we don't know how to build computers that are fast enough -- it's that we don't understand the problems well enough yet to design solutions similar to those used by the brain. Of course, the brain is also fallible and easily confused, so even if we make an "artificial brain" one day, it might be a pretty poor choice for running your local bank.

- Warren
 
  • #7
somasimple said:
I know about another half called glial cells, but axions?

Sorry, sometimes I get confused about all these terms. Thanks for the correction.
 
  • #8
As I understand it, the difference is that neurons indicate their state using chemicals, and use energy only to move those chemicals around, while the state of a transistor is both controlled and represented by energy. Right?
 

What is the relationship between computers, computation and heat?

Computers generate heat while performing computations. The amount of heat produced is directly related to the complexity and intensity of the computations being carried out.
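There is also a hard physical floor under this relationship: Landauer's principle says that erasing one bit of information must dissipate at least k*T*ln(2) of heat. A quick sketch of that limit at room temperature (real chips run many orders of magnitude above it):

```python
import math

# Landauer limit: minimum heat dissipated per bit of information erased.
k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # room temperature, K

landauer_J = k_B * T * math.log(2)
print(f"Landauer limit at 300 K: {landauer_J:.2e} J per bit erased")
# ~2.87e-21 J; a ~1 fJ logic-gate switch is roughly a million times higher.
```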

How does heat affect the performance and lifespan of a computer?

Excessive heat can cause a computer to slow down and even crash. It can also damage components, reducing the lifespan of the computer. Proper cooling and ventilation are important for maintaining optimal performance and prolonging the life of a computer.

Why do computers need to be cooled?

Computers need to be cooled because the components inside generate heat while processing data. If the heat is not dissipated, it can cause damage to the components and affect the performance of the computer.

What are the most common cooling methods used for computers?

The most common cooling methods for computers include air cooling, liquid cooling, and heat sinks. Air cooling uses fans to circulate air and dissipate heat, while liquid cooling uses a liquid coolant to remove heat. Heat sinks are passive cooling systems that absorb and dissipate heat from the components.

How can I prevent my computer from overheating?

To prevent your computer from overheating, make sure it is placed in a well-ventilated area and not surrounded by objects that could block airflow. Keep the internal components clean and dust-free. You can also consider using additional cooling methods, such as a cooling pad or liquid cooling system.
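The cooling advice above reduces to a simple steady-state model: the chip's junction temperature is roughly ambient temperature plus dissipated power times the thermal resistance of the cooling path. A sketch with illustrative numbers (the 0.5 and 1.0 C/W figures are assumptions, not specs for any particular cooler):

```python
# Steady-state thermal model: T_junction = T_ambient + P * theta,
# where theta is the total thermal resistance (C per watt) of the
# cooling path from die to air.

def junction_temp(ambient_c, power_w, theta_c_per_w):
    """Steady-state junction temperature in Celsius."""
    return ambient_c + power_w * theta_c_per_w

# 100 W CPU, 25 C room, decent air cooler (~0.5 C/W assumed):
print(junction_temp(25.0, 100.0, 0.5))   # 75.0 C -- comfortable

# Same CPU with a dusty, blocked cooling path (~1.0 C/W assumed):
print(junction_temp(25.0, 100.0, 1.0))   # 125.0 C -- throttling territory
```

Cleaning dust and improving airflow both work by lowering theta, which lowers the junction temperature linearly for the same power.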
