Difference between a computer and the human brain

SUMMARY

The discussion centers on the fundamental differences between computers and the human brain, emphasizing that while computers process data through fixed programs, the human brain can modify these programs based on contextual usefulness. Participants highlight that the brain operates with a complex network of interconnected processors, capable of evolving its processing methods, akin to genetic algorithms. The conversation also touches on the limitations of current computational models, such as LISP, and the need for advanced hardware capable of real-time processing at petaflop speeds to emulate human-like intelligence.

PREREQUISITES
  • Understanding of computer programming concepts, particularly compilers and data processing.
  • Familiarity with neural networks and their relation to brain function.
  • Knowledge of genetic algorithms and their application in evolutionary computation.
  • Basic grasp of computational performance metrics, such as petaflops.
NEXT STEPS
  • Research the principles of genetic algorithms and their implementation in artificial intelligence.
  • Explore the architecture and functioning of neural networks in relation to human cognition.
  • Investigate advancements in hardware capable of achieving petaflop processing speeds.
  • Study the concept of gradient descent and its applications in machine learning.
USEFUL FOR

This discussion is beneficial for computer scientists, AI researchers, cognitive scientists, and anyone interested in the comparative analysis of human and machine intelligence.

verdigris
Isn't the main difference between a computer and the human brain this: a computer uses a programme to process data, but the human brain processes data and also processes the programmes which process data (improving those programmes and thus increasing the brain's effectiveness and efficiency)? In other words, the human brain treats programmes as data.
 
What do compilers do?
 
I wasn't thinking of compilers, more this:
a computer could add two numbers to produce a sum.
But a human being can say "that sum isn't useful, I'll add a third number."
A compiler can't make this decision; a compiler works by using a set of rules.
People can change the rules and decide whether the result is useful.
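To make the contrast concrete, here is a toy Python sketch (not a claim about how brains actually work): a fixed rule for combining numbers versus a routine that treats its own rule as data, checks whether the result is "useful", and swaps the rule if it isn't. The function names and the usefulness test are invented for the example.

```python
# Toy illustration: a fixed rule versus a routine that treats its own
# rule as ordinary data and replaces it when the result fails a
# "usefulness" check. Everything here is made up for the example.

def add_two(numbers):
    return numbers[0] + numbers[1]

def add_all(numbers):
    return sum(numbers)

def is_useful(result, target):
    # Stand-in for "deciding the sum isn't useful": usefulness here just
    # means being close enough to some goal value.
    return abs(result - target) < 1

rule = add_two                      # the current "programme" is ordinary data
numbers, target = [2, 3, 4], 9

result = rule(numbers)
if not is_useful(result, target):   # the system judges its own output...
    rule = add_all                  # ...and replaces the rule it was using
    result = rule(numbers)

print(rule.__name__, result)        # add_all 9
```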
 
A human mind is more than the sum of its parts; a computer is not.
 
ChaoticLlama said:
A human mind is more than the sum of its parts; a computer is not.

Right, a mind is more than the sum: it is 2^n, where n is the number of parts. :wink:

A classical computer is the same, but we haven't figured out the software equivalent yet. LISP was very close in terms of overall function, but you really need a modular hierarchy of networked programs. Nor have we developed hardware that could run it in real time; you would need about 10^16 operations per second: petaflops.
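For a sense of where a 10^16 figure like this comes from, here is the usual back-of-envelope estimate in Python. The numbers are rough, commonly cited orders of magnitude, not measurements.

```python
# Back-of-envelope version of the 10^16 figure quoted above.
neurons = 1e11          # ~10^11 neurons in a human brain
synapses_per = 1e4      # ~10^3-10^4 synapses per neuron
firing_rate_hz = 10     # ~10 Hz average firing rate

ops_per_second = neurons * synapses_per * firing_rate_hz
print(f"{ops_per_second:.0e} synaptic events per second")   # 1e+16
# 1 petaflop = 10^15 operations per second, so 10^16 is about ten petaflops.
```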
 
setAI said:
you would need about 10^16 operations per second: petaflops

How many Intel Core 2s is that? 'Cause mine's got two.

Anyway, I don't think sheer processing power is the only problem (although it obviously plays a part); the brain also has hundreds of different "processors", each working on different processes but still connected (millions of connections) to each other. The brain also has the ability to re-connect one processor with another, making different connections depending on what is needed... we would need something equivalent to that in a computer to come close to a human brain.

I think it's very possible to create a computer as smart as a human brain (and smarter)... the only problem is we don't even fully understand the human brain yet... hell, we don't understand the fly's brain! So trying to build one is kind of backwards, kind of like trying to build a car before you understand how a wheel rolls.
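As a very loose illustration of that "rewiring" idea, here is a toy Python sketch in which modules are nodes and their connections are ordinary data that can be edited at run time. The module names are invented for the example.

```python
# A crude sketch of "many interconnected processors that can be rewired":
# modules as nodes, connections as an adjacency map edited on the fly.

connections = {
    "vision":  {"motor", "memory"},
    "hearing": {"memory"},
    "motor":   {"memory"},
    "memory":  set(),
}

def rewire(src, old_dst, new_dst):
    """Drop one connection and form another, depending on what is needed."""
    connections[src].discard(old_dst)
    connections[src].add(new_dst)

# e.g. a task that needs sound to drive movement directly:
rewire("hearing", "memory", "motor")
print(connections["hearing"])   # {'motor'}
```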
 
I reckon the human brain does the following: it represents a programme to perform some function (such as picking up a pen) as, say, binary 1100 (where a 1-neuron can be fired if a stimulus is given to it but a 0-neuron cannot be fired). It then varies the programme, changing it a bit to, say, binary 1110, and sees if 1110 is better at getting the hand to pick up a pen. If so, then 1110 is promoted to being the current programme and 1100 is put in memory in case it's needed in future for some reason. So the brain changes a number and sees if it likes the new one. To decide if it likes the new number, the brain checks whether the new number does the task quicker than the old number, as efficiently and as effectively. So it has a set of neurons that amount to an equation that weighs up these qualities, and into which the variables efficiency, speed and effectiveness are input.
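The mutate-compare-keep loop described here can be written down in a few lines. Below is a minimal Python sketch of that scheme (essentially the one-parent genetic algorithm chroot identifies below); the fitness function is a made-up stand-in for "how well the hand picks up the pen".

```python
import random

# Minimal sketch of the mutate-and-keep-if-better loop described above.
TARGET = [1, 1, 1, 0]                       # pretend 1110 really is the best programme

def fitness(programme):
    # Made-up score: how many bits match the target behaviour.
    return sum(a == b for a, b in zip(programme, TARGET))

def mutate(programme):
    flipped = programme[:]
    i = random.randrange(len(flipped))
    flipped[i] ^= 1                         # flip one bit, e.g. 1100 -> 1110
    return flipped

current = [1, 1, 0, 0]                      # the "current programme", 1100
archive = []                                # old programmes kept in memory

for _ in range(50):
    candidate = mutate(current)
    if fitness(candidate) > fitness(current):
        archive.append(current)             # old programme stored in case it's needed
        current = candidate                 # better programme is promoted

print(current)                              # usually [1, 1, 1, 0]
```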
 
verdigris said:
I reckon the human brain does the following: it represents a programme to perform some function (such as picking up a pen) as, say, binary 1100 (where a 1-neuron can be fired if a stimulus is given to it but a 0-neuron cannot be fired). It then varies the programme, changing it a bit to, say, binary 1110, and sees if 1110 is better at getting the hand to pick up a pen. If so, then 1110 is promoted to being the current programme and 1100 is put in memory in case it's needed in future for some reason. So the brain changes a number and sees if it likes the new one. To decide if it likes the new number, the brain checks whether the new number does the task quicker than the old number, as efficiently and as effectively. So it has a set of neurons that amount to an equation that weighs up these qualities, and into which the variables efficiency, speed and effectiveness are input.

I have no idea what evidence you have that the brain works this way, because it seems very unlikely that the brain uses anything resembling a "program" at all.

What you're basically describing, by the way, is a form of evolutionary computation known as a genetic algorithm. The "programs" you hinted at, such as "1110," would be known as "chromosomes" in the world of genetic algorithms.

It is possible (indeed quite likely) that the brain incorporates some form of gradient-descent in its operation. In other words, it begins learning a new task in some arbitrary way, and actively "climbs up hills," gradually changing its functionality until it reaches the most efficient way of doing the task.

Gradient-descent is a general concept, however, and does not have to be related in any way to discrete "programs." Other kinds of systems, like neural networks, also use gradient descent as a primary method of learning.

- Warren
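For readers unfamiliar with the term, here is a minimal numerical sketch of gradient descent on a made-up one-parameter cost function. It only illustrates the "gradually adjust until no further improvement" idea; all values are invented for the example.

```python
# Gradient descent on a toy cost function. Descent moves downhill on a
# cost, which is the mirror image of "climbing up hills" on a fitness.

def cost(w):
    return (w - 3.0) ** 2        # pretend w = 3 is the "most efficient" setting

def grad(w):
    return 2.0 * (w - 3.0)       # derivative of the cost

w = 0.0                          # start from some arbitrary setting
learning_rate = 0.1
for _ in range(100):
    w -= learning_rate * grad(w) # small step against the gradient

print(round(w, 3))               # ~3.0
```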
 
In a computer, a program is data. This is the idea that our modern computers are based on. It can be programmed arbitrarily.

chroot, you said "other kinds of systems, like neural networks"... aren't neural networks the current model of the brain?
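To illustrate the "program is data" point from the previous post, here is a toy Python interpreter in which the program is just a list that the same machinery can run or rewrite; the instruction set is invented for the example.

```python
# A toy version of the stored-program idea: the "program" is a list
# (ordinary data), and it can be executed or edited like any other data.

program = [("push", 2), ("push", 3), ("add", None)]   # data describing a computation

def run(program):
    stack = []
    for op, arg in program:
        if op == "push":
            stack.append(arg)
        elif op == "add":
            stack.append(stack.pop() + stack.pop())
    return stack[-1]

print(run(program))                     # 5

# Because the program is data, it can be rewritten and re-run:
program.insert(2, ("push", 4))
program.append(("add", None))
print(run(program))                     # 9
```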
 
