Efficiency of computers vs. brains

In summary, the fundamental reason modern electronic computers are faster than humans at basic arithmetic is that computers are explicitly designed for that task, while human brains evolved for other purposes. A contributing physical factor is that transistors are much smaller than neurons, so charge in a computer is transferred over much shorter distances. However, this underlying physics is only a small part of the story, since human brains remain better at processing large amounts of noisy, non-specific data. The conductivity of the respective materials does not appear to play a significant role in the speed difference between computers and brains.
  • #1
otownsend
TL;DR Summary
I'm trying to understand how a brain computes compared to a computer, and why computers are immensely better than humans at certain tasks.
Hi,

I’m an undergraduate student interested in cognitive processes. I’m about to ask a very interdisciplinary question, so hopefully I can find a physicist who can take on the challenge.

What is the fundamental reason behind why modern electronic computers (transistor computers) are faster than humans at doing basic arithmetic?

While we are still trying to figure out how to compare brains to computers, I think we can all agree that the brain has many computational features to it. The brain transfers ions between neurons and a computer transfers electrons through transistors. So then what makes computers more efficient at processing information? Does it have something to do with the conductivity of materials?
 
  • #2
otownsend said:
What is the fundamental reason behind why modern electronic computers (transistor computers) are faster than humans at doing basic arithmetic?

otownsend said:
what makes computers more efficient at processing information?

These are two different questions; "processing information" is much, much more general than "doing basic arithmetic".

To answer the first question, I would say that human brains did not evolve to do basic arithmetic; they evolved to do other things, and doing basic arithmetic is a modern task that we have to repurpose our brains to do. Electronic computers, on the other hand, are explicitly designed to do basic arithmetic, so of course they're going to be faster at it.

I think the second question is based on a false premise: I don't think computers are more efficient at processing all information, only some kinds of information.

otownsend said:
Does it have something to do with the conductivity of materials?

If you are asking why transferring ions between neurons is not as fast as transferring electrons through transistors, I don't think it's a matter of conductivity of the respective materials. I'm not sure neurons are any less conductive than transistor materials. I think it's just that transistors, at least the ones on a typical chip in a modern computer, are much smaller than neurons, so the transfer of charge occurs over a much smaller distance, hence takes a much shorter time.

However, I also think this underlying physics question is only a small part of the reason why computers are faster than humans at the tasks they are faster at. I think a lot of it has to do, as I said above, with the simple fact that computers are designed for particular tasks, whereas we humans have to repurpose for those tasks brains that evolved to do very different tasks.
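The size argument can be made concrete with a rough back-of-envelope comparison. All the figures below are order-of-magnitude assumptions (a ~1 mm neural path at ~10 m/s versus a ~100 nm transistor-scale path with signals at roughly a third of the speed of light), not measurements:

```python
# Rough back-of-envelope comparison of signal travel times.
# Every number here is an order-of-magnitude assumption, not a measurement.

neuron_distance_m = 1e-3        # ~1 mm, a short neural path
neuron_speed_m_s = 10.0         # ~10 m/s, unmyelinated-fiber ballpark

transistor_distance_m = 1e-7    # ~100 nm, modern transistor scale
signal_speed_m_s = 1e8          # ~c/3, signal speed in on-chip wiring

t_neuron = neuron_distance_m / neuron_speed_m_s          # ~1e-4 s
t_transistor = transistor_distance_m / signal_speed_m_s  # ~1e-15 s

ratio = t_neuron / t_transistor
print(f"neuron: {t_neuron:.1e} s, transistor: {t_transistor:.1e} s, ratio ~{ratio:.0e}")
```

Even if every individual figure is off by a couple of orders of magnitude, the gap in traversal time remains enormous, which is the point being made above.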
 
  • Like
Likes Klystron, Demystifier, davenn and 2 others
  • #3
PeterDonis said:
These are two different questions; "processing information" is much, much more general than "doing basic arithmetic".

To answer the first question, I would say that human brains did not evolve to do basic arithmetic; they evolved to do other things, and doing basic arithmetic is a modern task that we have to repurpose our brains to do. Electronic computers, on the other hand, are explicitly designed to do basic arithmetic, so of course they're going to be faster at it.

I think the second question is based on a false premise: I don't think computers are more efficient at processing all information, only some kinds of information.
Agreed; I would say that computers are very good at doing specific and limited and linear tasks like basic arithmetic whereas people are better at processing large amounts of non-specific data imprecisely. Computers can do basic arithmetic millions of times faster than a human, but they have trouble doing the processing necessary to drive a car and utterly fail at walking.
 
  • #4
russ_watters said:
Agreed; I would say that computers are very good at doing specific and limited and linear tasks like basic arithmetic whereas people are better at processing large amounts of non-specific data imprecisely. Computers can do basic arithmetic millions of times faster than a human, but they have trouble doing the processing necessary to drive a car and utterly fail at walking.

Yes, sorry for not distinguishing between the different types of information a thing can process. I entirely agree that humans are better than your typical computer at certain tasks (e.g. obviously motor functions, assuming the computer even has arms and legs), and that typical computers are better than humans at some other tasks (e.g. arithmetic).

I'm more wondering about the fundamental reason why modern-day computers are better than humans at the specific task of arithmetic. I am also curious why humans are better than computers at walking and other motor functions, but I don't want to throw too many questions at you at once. I like how Peter Donis spoke about how your average neuron in the brain is larger than today's transistors, and so a computer (made mainly of transistors) can deliver information a lot quicker. This is what I'm trying to get at here. However, I've also read some articles saying that you can't compare transistors to neurons, as neurons do way more tasks than just "turning on and off".
 
  • #5
otownsend said:
I like how Peter Donis spoke about how your average neuron in the brain is larger than today's transistors, and so a computer (made of transistors) can deliver information a lot quicker.

This is a general point about speed, but it doesn't explain why computers are much better at some tasks (like basic arithmetic) and much worse at others (like controlling walking). I don't think the explanation for that is to be found in physics; it's more a matter of biology and cognitive science and computer architecture and software. Discussions of all those things probably need to go in separate threads on other specific topics related to them.
 
  • #6
otownsend said:
I like how Peter Donis spoke about how your average neuron in the brain is larger than today's transistors, and so a computer (made mainly of transistors) can deliver information a lot quicker. This is what I'm trying to get at here.
It's not just the distance; the signal propagation speed is also something like a million times faster.
However, I've also read some articles saying that you can't compare transistors to neurons, as neurons do way more tasks than just "turning on and off".
I don't know what exactly they do...
 
  • Like
Likes otownsend
  • #7
The speed of neural transmission is orders of magnitude slower than signaling in even really old motherboards and CPUs.
There are single machine-code instructions for integer add, subtract, and divide. For floating-point operations (e.g. 4.2221 * 1.992), most modern CPUs have dedicated floating-point hardware that performs them exceedingly fast.

Human brains are not optimized in any way to do those kinds of arithmetic operations.

The fastest computers (usually large clusters) compete in FLOPS (floating-point operations per second). Summit ran at 143.5 petaFLOPS, i.e. ##143.5 \times 10^{15} = 143{,}500{,}000{,}000{,}000{,}000## operations per second.

see: https://en.wikipedia.org/wiki/TOP500
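The petaFLOPS figure above converts to plain operations per second like this (the one-operation-per-second human rate is just an assumed scale for comparison, not data):

```python
# Convert Summit's quoted 143.5 petaFLOPS into operations per second
# and compare it with an assumed human rate of ~1 arithmetic op/second.
PETA = 10**15

summit_flops = 143.5 * PETA   # 1.435e17 floating-point operations per second
human_ops_per_sec = 1.0       # generous assumption, for scale only

speedup = summit_flops / human_ops_per_sec
print(f"Summit: {summit_flops:.3e} FLOP/s, ~{speedup:.0e}x a one-op-per-second human")
```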
 
  • #8
otownsend said:
I entirely agree that humans are better at certain tasks over your typical computer (e.g. obviously motor functions, assuming the computer even has arms and legs)
The best example of this that I've ever run across: during his heyday, you could show a very simple line drawing of Bob Hope's profile to Americans who had never even SEEN a line drawing of that type, and they would know immediately that it was Bob Hope. A computer could be given a thousand pictures of Bob Hope from all angles, along with pictures of other people, and there were no algorithms (at least not at the time I read the article, which was some years back) that would allow the computer to identify the line drawing as being associated with Bob Hope.
 
  • Like
Likes diogenesNY and russ_watters
  • #9
otownsend said:
Summary: I'm trying to understand how a brain computes compared to a computer, and why computers are immensely better than humans at certain tasks.

What is the fundamental reason behind why modern electronic computers (transistor computers) are faster than humans at doing basic arithmetic?
Arithmetic was not needed during millions of years of brain evolution, and evolution never provides solutions to problems that don't affect survival. Our brains were not 'designed'.
 
  • Like
Likes Klystron and phinds
  • #10
otownsend said:
I'm more wondering about the fundamental reason behind why modern day computers are better than humans at the specific task of arithmetic.
Because, fundamentally, that's ALL computers know how to do.

otownsend said:
I am also curious why humans are better than computers at walking and other motor functions
Most likely it's because only a small fraction of computer programmers have any training whatsoever in feedback control theory. Those who do, e.g. Boston Dynamics, have demonstrated remarkable progress in robotic mastery of homeostatic motor functions.
 
  • #11
otownsend said:
Summary: I'm trying to understand how a brain computes compared to a computer, and why computers are immensely better than humans at certain tasks.
There are several differences between computers and brains that affect speed and efficiency for different tasks:
In computers, memory (of both data and algorithms) and processing are separated; in brains they are more or less combined.
Computers work (basically) serially; brains work in parallel.
A neuron works more like a leaky conductor that only fires if it gets a certain amount of input within a certain time; a transistor is like a switch.
The purpose of a brain is to transform vast amounts of input from all the senses into very small decisions that improve the chances of survival. One of the most important tasks in that context is recognition. The parallel working of the brain is much better suited for that than a serial computer, because in recognition different data points may be of totally different importance and therefore need to be treated differently. A parallel brain can easily calculate a diffuse bias towards one result or another (dog or cat, friend or stranger, twig or snake). It can easily filter out noise and irrelevant material or exaggerate tiny clues. And to improve the chances of survival further, the brain works with a fair amount of redundancy.
But since memory and processing happen more or less in the same area, each category to be recognized has its own space: one for faces, one for cars, one for melodies, etc. When it comes to arithmetic the human brain can be very fast, too, but only if trained, that is, if a certain area of the brain is dedicated to the task. People trained on the Chinese abacus, or some specially talented people, can perform huge calculations very, very fast (I knew an accountant who could sum up all the numbers on a sheet of paper in the blink of an eye).
Not as fast as a computer, but we must take into account that input and output are very different for a computer vs. a human.
A human needs to read or hear the question, interpret the images or sound, transform them into something the brain can process, and then re-transform the result into muscle movement of some kind. That takes additional time; the actual processing goes much faster. Since we generally train only a very small section of the brain to add numbers, the brain is very slow if it has to do very many very simple tasks (like adding 10000 numbers) that deliver many simple results (thousands of sums that need to be remembered).
On the other hand it is very fast when computing vast, noisy and conflicting input (like a distorted, incomplete image, or listening to someone at a noisy party) and delivering one result ("that is a cat", or a witty reply). Ask a computer, by voice in a room full of talking people, what one plus one is, and even with the most advanced voice-recognition software connected to a supercomputer it will fail to answer as fast as a human.
If we ask for the actual maximum speed of a brain, researchers disagree wildly, but we can assume the number lies somewhere above 100 Tflops. Due to the compartmentalized nature of the brain, very little of that computational power can be used for arithmetic, since most of the brain is already dedicated to other tasks like language, driving a car, choosing and putting on clothes, preparing breakfast, and a thousand other things computers generally do not need to care about. But we can assume that if we grew a brain in a lab, provided a massively parallel input/output device, and trained it to do only arithmetic and nothing else, it would be able to do that very, very fast too. However, due to the nature of the nerve, the results might not always be precise.
To sum up the answer to the first question: the fundamental differences are the dedication of areas, parallel vs. serial computing, and leaky conductors vs. precise switches.
To give an answer to the second question: the human brain is faster at processing the kind of noisy, incomplete information that reality usually provides.
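The "leaky conductor that only fires if it gets a certain amount of input within a certain time" can be sketched as a minimal leaky integrate-and-fire model. The threshold and decay constants below are arbitrary illustration values, not physiology:

```python
def lif_neuron(inputs, threshold=1.0, leak=0.8):
    """Minimal leaky integrate-and-fire model (illustrative constants only).

    Each time step the membrane potential decays by the factor `leak`,
    then the new input is added; crossing `threshold` emits a spike (1)
    and resets the potential, otherwise no spike (0).
    """
    potential = 0.0
    spikes = []
    for x in inputs:
        potential = potential * leak + x
        if potential >= threshold:
            spikes.append(1)
            potential = 0.0
        else:
            spikes.append(0)
    return spikes

# Two sub-threshold inputs spaced far apart leak away: no spike.
print(lif_neuron([0.6, 0.0, 0.0, 0.6]))  # → [0, 0, 0, 0]
# The same inputs arriving close together cross the threshold.
print(lif_neuron([0.6, 0.6]))            # → [0, 1]
```

Unlike a transistor switch, the output depends on the *timing* of the inputs, not just their values, which is the contrast being drawn above.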
 
  • Like
Likes sophiecentaur
  • #12
@Flisp (#11) is getting to the point. One needs to compare what actually happens when humans do arithmetic and when a computer "does arithmetic".

Typical humans have memorized addition and multiplication tables for single digits, and then have procedures that they follow for arithmetic tasks involving multiple digits. So arithmetic involves acts of recall (the tables) and acts of decision making (what do I do next in the multi-step tasks of advanced arithmetic). These steps in the human process of calculation involve specific operations in specific brain regions, running on brain hardware that is orders of magnitude slower than a modern CPU.

When a computer does arithmetic - I suppose it involves an analogous division of labor, in which the overall arithmetic task is broken down into a sequence of elementary operations which are directly computed by logic gates, the outputs of which are then combined to give the final answer. One significant difference is that a computer does not learn to do arithmetic, it is simply loaded with a program (or built with a hardwired circuit) which automatically treats an input as an arithmetic task. Humans, on the other hand, learn arithmetic the way a neural network learns to classify, through a statistical process of generalization subjected to positive and negative reinforcement.

But in the end I guess it's size and speed which is the core of the difference. A CPU is simply much smaller than a brain, and transistors are faster than neurons; so signals in a computer travel faster across shorter distances.
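The "sequence of elementary operations which are directly computed by logic gates" is quite literal: a ripple-carry adder built from nothing but AND, OR and XOR operations adds integers. A minimal sketch:

```python
def full_adder(a, b, carry):
    """One-bit full adder built from elementary gate operations."""
    s = a ^ b ^ carry                        # sum bit: two XOR gates
    carry_out = (a & b) | (carry & (a ^ b))  # carry logic: AND/OR/XOR gates
    return s, carry_out

def ripple_add(x, y, bits=8):
    """Add two integers by chaining full adders, least significant bit first."""
    carry, result = 0, 0
    for i in range(bits):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result  # wraps modulo 2**bits, like real fixed-width hardware

print(ripple_add(23, 42))   # → 65
print(ripple_add(255, 1))   # → 0 (8-bit overflow wraps around)
```

Real ALUs use faster carry schemes, but the principle is the same: the arithmetic is hardwired, with no learning involved.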
 
  • Like
Likes eloheim
  • #13
mitchell porter said:
Typical humans have memorized addition and multiplication tables for single digits, and then have procedures that they follow for arithmetic tasks involving multiple digits.
That's when they got their mathematical skills from 'school' / formal learning. There are many individuals who just 'know' the score at darts and, not only that, the score will instantly give them the combination of two or three darts needed for 301. I have taught 'catch up' maths to apparently struggling secondary school kids, and many times they just 'know' the answer without knowing how. As they say, "it just is". Try them out on their times tables and they struggle. Strange.
 
  • Like
Likes Nik_2213
  • #14
sophiecentaur said:
Try them out on their times tables and they struggle. Strange.
It is not strange at all. The brain processes information much faster on a less conscious level, like the extremely complex arithmetic needed to coordinate muscle movement to throw a ball or to form speech.

The research behind speed reading shows that the visual cortex can catch information and feed it straight into the areas where we process, understand and remember. But that requires that we mentally bypass the areas where we transform text or numbers into words and words into sounds. That extremely tedious process is often out of phase with the actual information processing, resulting in the brain being fed information twice, which then leads to confusion.

Bypassing the conscious to improve performance is known from many other areas: researchers dream their way to solutions, zen archers become one with their target, musicians flow with their music, sheep counters just see how many sheep there are, etc. The best way to break the top performance of your tennis opponent is to draw her/his attention to her/his own game. School teachers need to check whether students can really read by asking them to read out loud, forcing them to train a conscious, but much less efficient, method of reading, or of doing math. The conscious brain is great for non-routine problems, like creating new knowledge or predicting the future consequences of changes in our routines. But it is very slow, and therefore doing arithmetic consciously is very slow, or even fails if a less conscious, faster process is already in place and working.
 
  • #15
Flisp said:
It is not strange at all. The brain processes information much faster on a less conscious level, like the extremely complex arithmetic needed to coordinate muscle movement to throw a ball or to form speech.
Strange in the sense that it surprises most people when it is demonstrated to them. One can become 'familiar' with such things but Science has a long way to go if they are to be explained in a complete way.
I would say that the complicated 'bodily functions' that we accomplish (including the ones you describe) are not 'arithmetic' (they are essentially analogue and sense-related, for a start) so they are not really comparable with arithmetic feats.
What I find strange and impressive is that feats of arithmetical (and other logical) manipulation have no apparent context in human evolution. Maths has only been around for a few thousand years at most, and there would have been no 'Darwinian' advantage which could let a maths ability help its bearer survive and reproduce better. That qualifies for the description strange, I think. Furthermore, I found it strange that the innate arithmetical ability was not confined to high academic performers.
Education (like Medicine) has a long way to go if it's to optimise the progress of all kids and not just the ones with conventionally assessed abilities.
 
  • Like
Likes otownsend
  • #16
sophiecentaur said:
I would say that the complicated 'bodily functions' that we accomplish (including the ones you describe) are not 'arithmetic' (they are essentially analogue and sense-related, for a start) so they are not really comparable with arithmetic feats.

This is the view of Environmental (ECO) Psychologists like James Gibson.

https://en.wikipedia.org/wiki/Ecological_psychology

Cheers
 
  • #17
otownsend said:
So then what makes computers more efficient at processing information?
Even if you look only at the mere thinking part, we cannot (yet) build a 100 Tflop computer or network that weighs only 1.5 kg.
And if you look beyond the mere processing unit and include the energy supply, self-repair system, storage devices, memory devices, frame, temperature regulator, sensors, manipulators, transport system, etc. of a humanoid robot vs. a human being, the human being is still the most compact and efficient information gathering, processing and executing system on this planet, and will continue to be for some years.
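The compactness point can be made semi-quantitative. A hedged sketch, assuming the ~100 Tflop brain estimate from earlier in the thread, a ~20 W metabolic budget for the brain, and Summit's 143.5 petaFLOPS at an approximate ~13 MW draw (all ballpark figures, not measurements):

```python
# Rough operations-per-joule comparison; every figure is a ballpark assumption.
brain_flops = 100e12     # ~100 Tflop/s, the estimate quoted in this thread
brain_watts = 20.0       # ~20 W, a common physiology ballpark

summit_flops = 143.5e15  # Summit's peak, from the TOP500 figure above
summit_watts = 13e6      # ~13 MW, approximate reported power draw

brain_eff = brain_flops / brain_watts      # operations per joule
summit_eff = summit_flops / summit_watts

print(f"brain: ~{brain_eff:.1e} FLOP/J, Summit: ~{summit_eff:.1e} FLOP/J")
```

Under these assumptions the brain comes out a couple of orders of magnitude more energy-efficient, even before counting the robot's sensors, actuators and cooling.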
 
  • #18
sophiecentaur said:
I would say that the complicated 'bodily functions' that we accomplish (including the ones you describe) are not 'arithmetic' (they are essentially analogue and sense-related, for a start) so they are not really comparable with arithmetic feats.
We perceive them as analogue, but at the nerve level they are distinct, almost binary. A nerve either fires or it does not. And it fires only if a distinct number of impulses arrive at the nerve within a certain time frame. The actual processing of information is very similar to the workings of a computer (e.g. in a convolutional neural network). The millions of pixels of an image (coming from the retina) are compared, sorted, grouped, suppressed or exaggerated at many increasingly abstract levels in order to extract content. Despite the differences between brain and computer, the actual computation differs very little, and can be performed on either “device” with very much the same result.
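The compare/sort/group pipeline described here is essentially what a convolution layer in software does. A toy version in plain Python (the 1×2 difference kernel is an arbitrary edge detector, nothing biological):

```python
def convolve2d(image, kernel):
    """Valid-mode 2D convolution with plain lists (no padding)."""
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(len(image) - kh + 1):
        row = []
        for j in range(len(image[0]) - kw + 1):
            acc = sum(image[i + di][j + dj] * kernel[di][dj]
                      for di in range(kh) for dj in range(kw))
            row.append(acc)
        out.append(row)
    return out

# A horizontal-difference kernel 'exaggerates' the boundary in a tiny image:
image = [[0, 0, 1, 1],
         [0, 0, 1, 1],
         [0, 0, 1, 1]]
kernel = [[-1, 1]]  # 1x2 difference kernel
print(convolve2d(image, kernel))  # → [[0, 1, 0], [0, 1, 0], [0, 1, 0]]
```

The output suppresses the flat regions and highlights only the edge, the same suppress/exaggerate behaviour described above.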
sophiecentaur said:
What I find strange and impressive is that feats of arithmetical (and other logical) manipulation have no apparent context in human evolution.
Arithmetic is at its core an art of abstraction. To count things means to disregard differences and focus on common features. The root of the word “each” is literally “look alike”. And this art does indeed serve survival: to check very quickly that everybody in the group is accounted for (yes, we are all here, all ten of us), or to compare two sizes (they are three more than we are, or, there are enough apples for each of us). Some birds can count the number of eggs in their nest. The ability to compute ranges, times, speeds, distances, forces and supplies is as old as life, using whatever device is available: proteins, fatty acids, ion concentrations.
Humans have been counting and computing things, in the sense the OP asked and on a more abstract level, for at least 20000 years (the Ishango bone). It is a fair guess that simple, low numbers (one, two, one handful, many) have been part of human language and thinking for basically as long as there has been language. And with the presence of the abstract concept of numbers comes the possibility of abstract manipulation, hence arithmetic.
However, the investigation of the properties of numbers is indeed not so old, and most people find that to this day utterly useless. And doing math in the head is troublesome for most.
The reason for that might be found in how numbers can be represented in computers vs brains.
Logical gates can easily be assembled so they can represent precise numbers of basically any value.
In brains, however, the representation of numbers requires the firing of nerves. A nerve can fire at most about 300 times per second. That means a single nerve cannot distinguish between more than 300 distinct values per second (when sending). In fact, since the frequency of beta waves lies in the range of 14 to 30 Hz, nerves can only efficiently represent numbers up to about 20 (300/14), preferably 10 (300/30) to allow maximum speed. Most people can easily calculate in that range, and certain birds can do that as well. Above that we need to train special techniques. That requires involving more nerves, and that often leads to confusion, even frustration, until properly trained and synchronized: a feat many people never master (or feel they have any use for in their lives, for that matter). And today, with ubiquitous calculators, many can't even calculate simple numbers any more. But, and here I agree totally with you, ...
sophiecentaur said:
Education (like Medicine) has a long way to go if it's to optimise the progress of all kids and not just the ones with conventionally assessed abilities.
… what people, kids and teachers CAN DO and what they COULD DO are two very different things.
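The firing-rate arithmetic a few paragraphs up (a ~300 Hz peak rate read out within one beta-wave cycle) reduces to a single division; as a quick sanity check of the numbers quoted there:

```python
# The post's firing-rate arithmetic: a nerve firing at most ~300 Hz,
# read out within one beta-wave cycle (14-30 Hz), can represent
# only about 300/frequency distinct values per cycle.
MAX_FIRING_HZ = 300

def distinct_values(readout_hz):
    """Distinct values one nerve can convey per readout cycle."""
    return MAX_FIRING_HZ // readout_hz

print(distinct_values(14))  # → 21, the "about 20" in the post
print(distinct_values(30))  # → 10, the fast-readout limit
```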
 
Last edited:
  • Like
Likes sophiecentaur
  • #19
@Flisp Loads of good stuff in that post. I guess that what I have been trying to get at is the difference between brain processes that are based on formal methods and processes that are innate or learned in an informal way. In the last very few thousand years (or less), Maths in the broadest sense has changed the way that we approach and solve many problems and, at least as importantly, the way we communicate methods.
There has been a sequence of breakthroughs, the first being speech, the second being writing and the third has been the use of Maths to enable communication of ideas and solving problems.
I ask myself which way round it has been - was it evolution of the brain (in a pretty short time) or was it that we were able to use the existing hardware in a novel way? If the latter is true then there must be much more in common between natural and formally presented problems than we have appreciated. The alternative explanation would have to involve some 'intervention' by a supernatural inventor of some kind. As an atheist, I find that difficult to contemplate and the idea certainly doesn't fit in with PF's ideals.
 
  • #20
sophiecentaur said:
Loads of good stuff in that post.
Thank you. So kind!
sophiecentaur said:
I ask myself which way round it has been - was it evolution of the brain (in a pretty short time) or was it that we were able to use the existing hardware in a novel way?
I see no reason for either fast evolution or divine intervention. What we have developed are the tools. One can simply chop down a tree faster with an iron ax than a stone one. Our brains evolved together with signed and spoken language, the abilities of our hands, and the rising complexity of our social groups over many hundreds of thousands of years. Our fragile bodies required extreme skills to survive: to use natural materials for housing, clothing, long-distance communication, heating, weapons, storage of supplies, etc. Without maps and books one needs a formidable memory. Especially during times of trouble, disease, or climate change, our creative brain served us well. Anyone who has tried to live simply knows that it is a full-time job, demanding all your wit and senses.
I think the rapid development we see today requires nothing new, nothing special in our brains. There are actually very few people in the world who are great mathematicians. In earlier times those would have improved on the shape of the stone ax or how to make pottery. Most people simply use the methods someone else has developed earlier. It does not require much training to follow some algorithm. However, the sum and quality of all the tools we have access to and the increasing number of people who have access and time to use them, is breathtaking, giving us the illusion that even we have changed. We might, in a small way, but in general, the anatomy of our brain is said to be more or less unchanged for the last 200 000 years (give or take times and places of poor nutrition).
As far as I know, people coming from populations where no sophisticated math existed do not perform better or worse once educated.
sophiecentaur said:
I guess that what I have been trying to get at is the difference between brain processes that are based on formal methods and processes that are innate or learned in an informal way.
I think we can see two parallel things happening when people (brains) use math. One is that people use a formal method to arrive at some solution, simply following the instructions or using the tool without any (deeper) understanding of how it actually works. It gives us the illusion of mathematical power, but there need be none: some clicks on the keyboard and voilà, there is the root of 17, bypassing all inner processes, just performing some sequence with your fingers on your smartphone.
The other would be that when people use mathematical tools they transform them in their brains to Lego or landscapes, colors or music and play with them until a problem looks solved. Then this play is translated back into the language of math. Once written down in that language it is basically unintelligible even for skilled mathematicians, until translated into some inner image they can grasp.

When designing a computer program I see some inner dance of groups and entities, they move, change color and shape, merge and separate. I turn that dance into code that works, but the code tells me nothing. It is that inner dance that guides me. Without the code, without a computer to run the code on, like 200 years ago, I could have used that vision to compose music, tend my garden or create beautiful food. And 200 thousand years ago I may have danced with my fellow hunters to catch our prey.

And I believe, to come back to the original question, therein lies the power of the human brain, when compared to computers. The ability to transform a problem. To create that informal inner dance of change and use it as a guide when using formal methods and tools to verify or execute the solution. As far as I know no software has been written yet, that can do that.
 
  • Informative
Likes Klystron
  • #21
Flisp said:
The ability to transform a problem.
That's it. Identifying the tools one already has in order to solve a new problem. Transforming the problem into something for which the tools exist. And I imagine the communication between individuals speeds the process in a factorial way.

You have to admit that the emergence of Maths is a hard one. It must say something about the Universe that our Maths does such a good job (so far, at least) of nailing the way it behaves. A sort of equivalence between the physical world and the other side of things that we call Maths. That sounds a bit arbitrary.
 
Last edited:
  • Like
Likes Klystron
  • #22
sophiecentaur said:
It must say something about the Universe that our Maths does such a good job (so far, at least) of nailing the way it behaves.
Our universe is an extremely well ordered place, where every particle or group of particles strictly obeys the rules that keep it all running. Every living system has somehow managed to capture some of these rules, relies on them and is able to make predictions based on them. Math is one of many attempts; in some areas extremely powerful and successful in others not so.
Tiger Woods’ brain calculates in seconds how muscles have to contract to hit the ball at such a speed and angle that it flies some 150 meters, hits uneven and tilted ground, reverses its direction due to its under-spin, and rolls back 4-5 meters into the hole. Jan-Ove Waldner (table tennis), relying on the blurred millisecond images he got from his opponent's movement, jumps to some point 4 meters from the table, moves his bat through the air and, his back turned to the table all the time, hits the ball. The ball flies in a high arc and hits the table no more than a centimeter from its corner. Waldner never even turned to look. Asked afterwards, he said he didn't need to; he had ”such good control of the ball”. While the equations to make these incredible calculations and execute them in real time are embedded in the brains and bodies of these “warriors of applied physics”, including all the different materials, speeds, spins and accelerations, wind, humidity, turbulence and what not, no physicist comes even close to similar results using the best equations and the fastest computers, even given all the computing time in the world.

The “superior speed of computers” the OP sees may have a very different cause. While basically everyone can learn to read text, reading equations is a pain for most (even for me, and I am both fond of and fairly good at math). Maybe it is math and arithmetic itself that for many simply feels wrong. It lacks the dynamics, the flow and movement we experience in life. The beauty of the dance is lost somewhere. Most people can “intuitively” understand most physical phenomena, even quantum-mechanical and relativistic ones. But present the equations and most shy away.

Math can map the Universe (or 5% of it :-), but only with utter difficulty. Even if we succeeded in uniting QM and relativity, how many would be able to understand the equations? Computers will be able to do the calculations, because they are based on the same math. But no human, not even the most talented, will be able to use those equations without artificial aid. In my view that is very wrong. Math should be accessible to everyone. Equations should be a useful tool describing and reflecting the beauty of the Universe, not a mountain of pain to climb and a thicket of thorns to penetrate. While “everything flows”, I fear that the rigidity and oversimplification of arithmetic at its base leads to unfathomable complexity when trying to apply math to reality. The presence of the value zero in the mathematical system creates trouble not present in reality. There is no glitch in reality when an apple is shared by no-one, and there is in this Universe no event that adds up to truly nothing. But some time, thousands of years ago, we started on a path in this direction, and there is no easy going back. So we marvel at the speed with which computers calculate, dazzled and blinded, and miss the fact that their calculations lack life and cannot catch the reality of a single leaf falling off a tree.
 
  • Like
Likes otownsend
  • #23
Flisp said:
But it is very slow, and therefore doing arithmetic consciously is very slow, or even fails if a less conscious, faster process is already in place and working.

What do you mean by "less conscious"? I have an intuition about what you mean, but I am curious how it has been defined in the research you are mentioning. Perhaps something related to the use of the pre-frontal cortex (where our ego exists) and amygdala (emotional response)?
 
  • #24
otownsend said:
What do you mean by "less conscious"? I have an intuition about what you mean, but I am curious how it has been defined in the research you are mentioning. Perhaps something related to the use of the pre-frontal cortex (where our ego exists) and amygdala (emotional response)?
Researchers from different fields have very different definitions of “consciousness”. And since you ask “a very interdisciplinary question”, different readers of this post will have different ideas and concepts, and that can cause much confusion. The concepts of emotion, subconsciousness and consciousness had until recently been more or less taboo in disciplines like physics, math and computer science. But in more and more areas, researchers and users prefer a “result over equation” approach and simply let some AI application solve the problem. The other day I witnessed a foreign-exchange micro-crash that only lasted some seconds. Probably some simpler applications “panicked” while some “emotionally more stable” programs earned their owners millions. Such events demand that researchers deal with the issues of machine emotion and consciousness with increasing urgency. There is in our brains no strict distinction between conscious, subconscious and emotional thinking. All parts of the brain are interconnected and all activities interact in gradients. And when we shape AI applications we must be aware that such phenomena might emerge even if not intended.
When we learn something consciously, when we follow some scheme, algorithm or instruction, we do that by concentrating on every step we take, regardless whether we learn how to read, calculate, ride the bike, play an instrument or do some exercise.
When we then repeat that scheme over and over again the process becomes increasingly automatic, to the degree that our consciousness can “observe” what we are doing. We can watch our fingers typing, but the movements are too fast to follow consciously in every detail. Even so, we can consciously change spelling and speed. So the process is by no means subconscious. It is a blend of automatized and conscious parts.
That automation happens by itself. The brain tries to make movements more cost efficient and to pick up rules (like gravity, air resistance or muscle endurance) from its real or purely imagined environment. Over time activity withdraws step by step from the prefrontal cortex to the cerebellum. Scans of the brains of people that perform automatized problem solving (like the Rubik’s cube) show that there is basically no activity in the conscious part of the brain. And if any activity appears, performance goes down.

We can also see that these less conscious parts (defined as processes we cannot describe verbally) can detect hidden rules faster than the fully conscious parts (defined as processes we can describe verbally). In one experiment, people were given cards and asked to play, but were not told the rules. After each move they were shown the score. While they believed they were playing at random, their score started to improve after some time, yet it took seven times as long before they could explain how they did it.
When we learn to read or do math it also happens that these less conscious parts figure out how to do it, but we (or our teacher) might fail to realize that. So we struggle on two (or more) levels, and sometimes one gets in the way of the other.

By the way, computer hard- and software in the area of AI now really starts to lift off. My guess is that within the next decade computers will outperform us on every cognitive level (the Nvidia DGX-1 already runs about as many flops as the human brain).
The answer to your original questions will then be very simple: Yes, computers perform better on all levels and in all areas, because transistors and electricity are so much faster, smaller and cheaper and can form much faster networks over long distances, than nerves and bio-electrical impulses networking via speech/text/numbers/equations; and optimizing the software is so much easier.
 
  • #25
otownsend said:
What is the fundamental reason behind why modern electronic computers (transistor computers) are faster than humans at doing basic arithmetic?
I think the answer lies in the specific operating structure of the two entities. The human brain accumulates new data on top of old data, forming a densely linked structure. To access a specific piece of data it must visit a considerable number of nodes. For simple basic arithmetic like the addition 121+023, it seeks the answer from stored info; it is not processing the addition, it recalls a memorized result. It looks up the stored info set { add{1,0} = 1, add{1,1} = 2, ……, add{1,9} = 0 with 1 carry }:
seek add{1,3}, find (4)
seek add{2,2}, find (4)
seek add{1,0}, find (1) ===> (1)(4)(4) = 144
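As an aside, the digit-by-digit lookup scheme described above can be sketched in Python. The table and function names are my own illustration of the model, not a claim about how neurons actually store sums (and the tables are filled with Python's `+` purely for brevity; think of them as memorized facts):

```python
# A sketch of the "memorized table" model of addition described above.
# Inside recall_add no arithmetic circuit is used, only table lookups.
ADD   = {(a, b): (a + b) % 10  for a in range(10) for b in range(10)}
CARRY = {(a, b): (a + b) // 10 for a in range(10) for b in range(10)}

def recall_add(x, y):
    """Add two non-negative integers digit by digit via table lookups."""
    xs = [int(d) for d in str(x)][::-1]     # least significant digit first
    ys = [int(d) for d in str(y)][::-1]
    n = max(len(xs), len(ys))
    xs += [0] * (n - len(xs))
    ys += [0] * (n - len(ys))
    out, carry = [], 0
    for a, b in zip(xs, ys):
        s, c = ADD[(a, b)], CARRY[(a, b)]            # "seek add{a,b}"
        s2, c2 = ADD[(s, carry)], CARRY[(s, carry)]  # fold in the carry
        out.append(s2)
        carry = c + c2                               # at most one is ever 1
    if carry:
        out.append(carry)
    return int("".join(str(d) for d in reversed(out)))

print(recall_add(121, 23))   # 144, matching the worked example above
```

Each digit position costs two recalls plus carry bookkeeping, which is exactly why this model predicts multi-digit arithmetic to be slow and serial.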

Computers use NAND gates to build full-adder circuits (flip-flops, also built from NAND gates, hold the operands and the result). The two numbers exist in binary form on the adder input: (0111 1001){121} & (0001 0111){23}. When the clock signal is received, all binary digits are added concurrently. Very fast! No seeking in stored info.
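For contrast with the lookup model, here is a sketch (in Python, for readability) of how a full adder can be composed entirely from NAND gates and chained into a ripple-carry adder. Python models only the logic; in silicon these gates switch in nanoseconds and evaluate in parallel, limited by carry propagation delay rather than sequential loop iterations:

```python
# A NAND-only full adder, chained into a fixed-width ripple-carry adder.
def nand(a, b): return 1 - (a & b)

def xor(a, b):
    t = nand(a, b)
    return nand(nand(a, t), nand(b, t))

def and_(a, b): return nand(nand(a, b), nand(a, b))
def or_(a, b):  return nand(nand(a, a), nand(b, b))

def full_adder(a, b, cin):
    """One bit position: returns (sum bit, carry out)."""
    s1 = xor(a, b)
    return xor(s1, cin), or_(and_(a, b), and_(s1, cin))

def ripple_add(x, y, bits=8):
    """Add two integers bit by bit, carry rippling up; overflow wraps."""
    total, carry = 0, 0
    for i in range(bits):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        total |= s << i
    return total

print(ripple_add(121, 23))  # 144
```

Note that the final carry out is discarded, so an 8-bit add of 255 + 1 wraps to 0, just as fixed-width hardware does.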
(There is no multiply unit in the computer; therefore the computer performs repeated addition for multiplication.)

The human brain is slow at basic arithmetic because it must seek the answer add{x,y} for every digit pair. But its speed is the same when seeking more complicated info stored in the neural cluster. This structure is also good for seeking affinities between pieces of info (as in big data), which is big trouble for computers.
 
  • #26
TMT said:
(There is no multiply unit in computer. Therefore computer performs repeated addition for multiplication)
There most certainly IS a multiply unit in computers.
 
Last edited:
  • #27
phinds said:
There most certainly IS a multiply unit in computers.
Are you sure? As far as I know, multiplication is done with microcode implementing the "russian peasant" algorithm ("add & shift"). I have not seen multiplication hardware in an ALU. If you are sure, I will be glad to see this circuit. Can you give me a link?
 
  • #28
TMT said:
are you sure? As far as I know, multiplication is done with a micro code implementing "russian peasant" algorithm "add & shift" I did not seen multiplication hardware in ALU. if you sure I will be glad to see this circuit. Can you give me a link?
The Russian peasant algorithm is no longer used as far as I know. You should be able to find plenty of stuff on the Internet. For floating point multiplication in particular there are FPUs (floating point units) which are specialty hardware.
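For readers curious what the "russian peasant" (shift-and-add) method mentioned above actually does, a minimal Python sketch:

```python
def peasant_multiply(a, b):
    """Russian peasant multiplication: halve a, double b,
    and accumulate b whenever a is odd. Only shifts and adds are used,
    which is why it maps so naturally onto simple hardware or microcode."""
    total = 0
    while a > 0:
        if a & 1:          # lowest bit set: this power of two contributes
            total += b
        a >>= 1            # halve a (shift right)
        b <<= 1            # double b (shift left)
    return total

print(peasant_multiply(13, 11))  # 143
```

It performs one addition per set bit of `a`, so it is far faster than naive repeated addition, though slower than the dedicated array multipliers in modern CPUs.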
 
  • #29
  • Like
Likes TMT
  • #30
Thank you jim mcnamara, your contribution was very valuable for me (to upgrade my knowledge). In my education (1967), that stuff was called microcode (a result obtained by executing several other basic operations).
 
  • #31
TMT said:
To access a specific data it must visit considerable amount of nodes. For simple basic arithmetic like addition 121+023, it seek the answer from stored info. it is not processing the addition! it seek memorized result.
I seriously doubt that we know exactly what happens in the brain when someone is calculating (if you have some article I would be thrilled). But even if your model might be correct when the network is insufficiently trained, it is certainly not correct once a person is trained to do – well – basically anything. During training, neural pathways are reinforced until operations can be performed without search.
How this happens can be seen in the training of calculation champions.
First students are trained on the Chinese abacus. The math-problem is translated into some initial position of the beads. The beads are then moved according to specific rules using the fingers. When all movements are executed the result can be translated into numbers from the final position of the beads.
The use of the abacus suits the human brain very well, since the brain is very good at manipulating things with the fingers, at processing visual information and at imagining visual change. The actual calculation is thus transformed into images of, and changes in, bead positions, which the brain can handle much more easily than abstract numbers. (Just as masters of memory use visual images and stories to remember thousands of numbers.)
When the student becomes fast and flawless in the use of the abacus, he/she is trained to do it blindfolded, effectively reinforcing the areas that imagine the movements. From here on the brain performs no more searches. All movements are automatized. Translations and manipulations are executed by dedicated nerves.
When that is achieved the student starts to perform the movements without abacus, but still moving the fingers. This reduces the total number of nerves needed to perform. And even more so in the last step, when the finger-movements are reduced to mere flicker so no time is wasted. All movements are visualized and calculation happens very fast.
TMT said:
(There is no multiply unit in computer. Therefore computer performs repeated addition for multiplication)
I’m sorry to say, but this is definitely wrong. Computers use the so-called Booth's algorithm or something similar, which performs a combination of bit-shift and addition, much, much faster than repeated addition. CPUs have dedicated circuits for that.
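A sketch of Booth's algorithm in Python may make the shift-and-add idea concrete. The word width and variable names here are my own choices; hardware does the same work with a dedicated shifter and adder in the datapath:

```python
def booth_multiply(m, r, bits=8):
    """Booth's algorithm: signed multiplication by examining two bits,
    adding or subtracting the multiplicand, then arithmetic-shifting right."""
    mask = (1 << bits) - 1
    total_bits = 2 * bits + 1               # product plus one extra low bit
    A = (m & mask) << (bits + 1)            # multiplicand, aligned high
    S = (-m & mask) << (bits + 1)           # its two's-complement negation
    P = (r & mask) << 1                     # multiplier, with extra 0 bit
    for _ in range(bits):
        if (P & 0b11) == 0b01:              # bits "01": add multiplicand
            P += A
        elif (P & 0b11) == 0b10:            # bits "10": subtract it
            P += S
        P &= (1 << total_bits) - 1
        sign = P >> (total_bits - 1)        # arithmetic shift right by 1,
        P = (P >> 1) | (sign << (total_bits - 1))  # preserving the sign bit
    result = P >> 1                         # drop the extra low bit
    if result >= 1 << (2 * bits - 1):       # reinterpret as signed
        result -= 1 << (2 * bits)
    return result

print(booth_multiply(7, 6))    # 42
print(booth_multiply(3, -4))   # -12
```

Runs of identical bits in the multiplier cost only shifts, which is why Booth's method beats repeated addition so decisively.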
 
  • #32
Hi @otownsend ,
I haven't read all the posts in this thread, but I wanted to comment on this:

otownsend said:
Summary: I'm trying to understand how a brain v.s. a computer computes and why computers are immensely better than humans at certain tasks.

So then what makes computers more efficient at processing information?

There is no computer built or programmed yet that is even close to the language processing abilities of an average normal human. By far.
 
  • Like
Likes sophiecentaur
  • #33
Flisp said:
I’m sorry to say, but this is definitely wrong. Computers use the so called Booth’s algorithm or something similar, which performs a combination of bit shift and addition, much, much faster than repeated addition. CPUs have dedicated circuits for that.
If you look at Booth's algorithm, it is expressed as a loop, and the loop is controlled by the conditional branch "32nd repetition?". This is evidence that there is no dedicated multiplication circuit in the computer, but rather microcode that performs multiplication by executing an algorithm (using existing circuits such as the full adder) 32 times.
I have not read any articles about these, but I have some theories about learning based on several real-life biological experiments. For you, one may conduct a simple experiment with an infant who is learning addition: if you practice addition with him/her, drilling one selected number pair like 3&4 only rarely while making the exercises with other numbers (containing no 3 or 4) more frequent, then at test time the operations containing 3&4 will take relatively longer than the other combinations, since 3+4 is not well memorized. (The same holds for multiplication.)
You must not take the search process in neural pathways to be similar to a computer's. A neural cell produces a signal to another related neural cell in the cluster; it is not seeking one by one as a computer does. Make an analogy with an object in a computer program: when you send the {+4} signal to object {3}, it responds back with 7. Neural cells in the brain are ordered in a cluster structure; all have links to the others. When an addition request arrives, only the adding cluster is activated. Then, when number 3 is received, cell {3} in the cluster is activated, and on receiving number 4, cell {4} in the cluster is activated. Then both cells produce a signal in the cluster that activates cell {7}.
There are also memorized procedures (like microcode), such as your Booth's algorithm: starting from the rightmost digits, repeat the process until the leftmost digit.
Visual memory and sound memory are different inputs. You may consider the cerebellum a co-processor where muscle controls are stored; the specific rules of the finger movements are stored there. Abacus learning stores some info in the cerebellum; when activated, the cerebellum produces a series of actions via the fingers and the beads are moved. Then visual memory takes action, takes a picture of the instant and re-activates the cerebellum... At the end, visual memory translates the resulting beads into numbers.
But all of this shows that the human brain has no mechanism to perform arithmetic operations. It has a great ability to store information and to access what is stored in a short time, and if you access a stored piece of information more frequently, the access time is reduced.
 
  • #34
I found that post really good fun @TMT
The way hardware is used to optimize speed is nothing short of magic. I understand that different processors use their own methods (at least in floating-point and graphics routines). I imagine human brains have an even wider range of methods.
 
  • #35
sophiecentaur said:
I found that post really good fun @TMT
I imagine the human brains have an even wider range of methods.
I agree with you! It has a wider range of METHODS (algorithms), but no structure to perform arithmetic operations. It performs store and recall in all of those processes. I mean there is no adder structure to sum up 3 & 4 and produce 7; rather, it stores "3+4" and the result "7". It also stores an iterative method for multi-digit addition (single-digit add, then use the carry digit), or it may sometimes prefer doubling, as in "222" + "222" = "222" * "2" = "444", but again it will recall "4" as the result of the "2*2" operation.
I tried to emphasize that the brain uses a recall structure as the basis of its operations.
Computers also use logic gates and components made from those gates. They utilize several components chaperoned by another component implementing an algorithm. Multiplication in computers is done in this manner: there is no single component (adder, shift register or flip-flop) that is the multiplier by itself, but a control component that orchestrates an adder & shift register to accomplish the multiplication process.
The recall process in the brain is not a seek-and-find in a stored array. It is not searching for "3+4" among a bunch of numbers; when "3+4" is signaled, "7" is signaled spontaneously.
 

What is the efficiency of computers compared to human brains?

The efficiency of computers compared to human brains is a complex and debated topic. It is difficult to directly compare the two as they function differently and have different capabilities. However, in terms of processing speed and storage capacity, computers are generally considered to be more efficient than human brains.

Can computers perform tasks faster than human brains?

Yes, computers can perform certain tasks much faster than human brains. This is because computers are able to process and retrieve information at incredibly high speeds, while human brains are limited by biological factors such as nerve conduction speed.

Are there any tasks that human brains can perform more efficiently than computers?

Yes, there are certain tasks that human brains can perform more efficiently than computers. For example, tasks that require creativity, emotional intelligence, and critical thinking are still better performed by humans. Additionally, humans are better at adapting to new situations and solving problems in unstructured environments.

What factors contribute to the efficiency of computers and brains?

The efficiency of computers is largely determined by hardware components such as processors, memory, and storage. On the other hand, the efficiency of human brains is influenced by factors such as genetics, education, and experience. Both computers and brains also rely on efficient algorithms and processes to perform tasks.

Can computers ever match the efficiency of human brains?

It is difficult to predict if computers will ever match the efficiency of human brains. Some experts believe that advancements in technology and artificial intelligence may eventually lead to computers surpassing human brains in certain areas. However, others argue that the complexity and adaptability of the human brain may always give it an edge over computers.
