Wooden Neurons: Will They Be Conscious?

AI Thread Summary
The discussion explores the concept of consciousness in non-biological systems, questioning whether mechanical neurons or a collective of humans mimicking neuronal functions could achieve consciousness. It raises the idea that consciousness may emerge from specific arrangements of matter, regardless of the materials involved. The conversation also delves into the implications of creating perfect copies of individuals and whether those copies would share consciousness or possess a separate identity. Participants highlight the challenges in defining consciousness and the potential for various systems, including computers and even rocks, to exhibit some form of consciousness. Ultimately, the dialogue reflects on the complexities of consciousness and the philosophical debates surrounding its nature and origins.
Meatbot
What if you have 10 foot neurons made of wood with mechanical clockwork innards, and they shoot different types of metal balls to each other to send messages. What if they have some mechanical method for reproducing every other function of a neuron? What if they are arranged in exactly the same way the neurons in the brain are? Will that group of wooden neurons be conscious?

What if you take a group of 100 million people and have each one perform the duties of a neuron? Will that system be conscious?

Seems to me you would have to say that it would be conscious, as strange as that sounds.
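The "duties of a neuron" each wooden unit or person would have to perform can be made concrete. As a rough sketch (a leaky integrate-and-fire rule, which is a standard simplification and not a claim about everything real neurons do; all numbers are illustrative), the duty reduces to a few lines of arithmetic:

```python
# Each "wooden neuron" (or person) follows a simple local rule:
# accumulate incoming signals, leak a little, fire when a threshold is crossed.
# This is a textbook leaky integrate-and-fire model, far simpler than a real
# neuron; the leak and threshold values are illustrative.

def step(potential, inputs, leak=0.9, threshold=1.0):
    """One update cycle: decay the potential, add inputs, fire if over threshold."""
    potential = potential * leak + sum(inputs)
    if potential >= threshold:
        return 0.0, True    # reset and send a "metal ball" (spike) downstream
    return potential, False

# Three cycles of incoming signals; the unit fires on the third.
v, fired = 0.0, False
for incoming in [[0.4], [0.5], [0.6]]:
    v, fired = step(v, incoming)
```

Whether following such a rule, at any scale and in any material, suffices for consciousness is exactly the question of the thread.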
 
Where does consciousness occur, though? I've always wondered the same sort of thing. Only imagine we had a machine that can make EXACT copies of you (scans 100% accurate, recreates 100% accurate).

Would you sort of share consciousness? Or does he have a new consciousness? Is it even conscious at all? Hm.
 
Sorry! said:
Where does consciousness occur, though? I've always wondered the same sort of thing. Only imagine we had a machine that can make EXACT copies of you (scans 100% accurate, recreates 100% accurate).

Would you sort of share consciousness? Or does he have a new consciousness? Is it even conscious at all? Hm.

I think he would be conscious, and he would think he was you. If you asked him, he'd claim to be you. He'd have all of your memories. There would be two consciousnesses. I think consciousness is something that emerges from certain arrangements of matter.
 
The China brain does create a mental state. This is yet another one of those intuition pumps that fall to bits when analyzed, just like the Chinese room. Our intuition that it is impossible is just a bias against non-neuron minds, furthered by the implausibility of the scenario. There is a natural desire to locate the mind at a specific point, because the mind feels like one thing.
 
Meatbot said:
What if you have 10 foot neurons made of wood with mechanical clockwork innards, and they shoot different types of metal balls to each other to send messages. What if they have some mechanical method for reproducing every other function of a neuron? What if they are arranged in exactly the same way the neurons in the brain are? Will that group of wooden neurons be conscious?

What if you take a group of 100 million people and have each one perform the duties of a neuron? Will that system be conscious?

Seems to me you would have to say that it would be conscious, as strange as that sounds.
What if you had, say, silicon doped into p- and n-types, and, say, had a huge grid of alternating types arranged based on logic, and then decided to pump electricity through it? ;)

I don't see how you could just take a bunch of neurons and throw them together in whatever order you need when we don't even understand what makes something "conscious". Obviously, if you knew, then you would have to call it "conscious" regardless of the materials (light interference, anyone?).
 
dst said:
I don't see how you could just take a bunch of neurons and throw them together in whatever order you need when we don't even understand what makes something "conscious".
Yeah...I think only certain arrangements would be conscious in the way we normally understand it, although I think consciousness is a continuum. Some arrangements might be "less" conscious and some "more" conscious, whatever that might mean. Also, some arrangements would operate very slowly, and some could operate faster than our brain. How might it feel to think 100 times slower than you do now?

Who is to say that a rock is not in some way minimally conscious? It has signals that propagate through the vibration of atoms. The atoms respond to each other's vibration. It can take input from vibration of the rock, and from temperature changes. It can change its structure by melting. This is obviously different from what the brain does, but is it different enough that it's not conscious? I don't know.
 
Well, it could be, depending on how you look at it. Why isn't a computer conscious? "It" has a "state of mind" (the contents of its CPU registers), it has a "memory", it has feelings ("tiredness", or actually just mechanical wear and fatigue), etc. I mean, sure, you program a PC to do what it does, but isn't that what evolution or God effectively did to us?

Take 10 normal (i.e. not mentally dysfunctional) people in a room and shoot their mothers in front of them. All display the same reaction (horror). Take 10 more, repeat; 100 more, rinse and repeat. Any change? Not really; some might have different reactions (good riddance, fear, etc.). Take 100 computers and load up MS Paint. MS Paint loads up. Some will have different reactions (for instance, the Windows-based PCs will crash :D). So in any case, how would you logically define "consciousness"?
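The computer side of this analogy can be sketched directly: a deterministic program produces the same "reaction" on every machine that runs it, which is the parallel being drawn to the identical human reactions. The stimulus and response labels below are purely illustrative:

```python
# A fixed stimulus-response rule standing in for a loaded program;
# the labels are illustrative, not a model of real emotion.
def reaction(stimulus):
    return "horror" if stimulus == "tragedy" else "indifference"

# 100 "computers" running the same program give 100 identical reactions,
# just as the thought experiment predicts for identically built systems.
reactions = [reaction("tragedy") for _ in range(100)]
```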
 
dst said:
Well, it could be, depending on how you look at it. Why isn't a computer conscious? "It" has a "state of mind" (the contents of its CPU registers), it has a "memory", it has feelings ("tiredness", or actually just mechanical wear and fatigue), etc. I mean, sure, you program a PC to do what it does, but isn't that what evolution or God effectively did to us?

Look up multiple draft model of consciousness.
 
dst said:
Well, it could be, depending on how you look at it. Why isn't a computer conscious? "It" has a "state of mind" (the contents of its CPU registers), it has a "memory", it has feelings ("tiredness", or actually just mechanical wear and fatigue), etc.?
Perhaps it is on some dim level.
 
  • #10
Moridin said:
Look up multiple draft model of consciousness.

Yeah, I got this: http://www.conscious-robots.com/en/reviews/theories-and-models-of-consciousness/multiple-draft-2.html

Dennett argues that consciousness is executed in some sort of virtual machine, modulated by cultural learning, that runs on the brain's parallel hardware. This virtual machine installs in the brain a set of 'mind habits', which are not visible at a neuroanatomical level.

Another analogy would be a hardware abstraction layer. The point being that there is a soul of sorts, but it's just a collection of ridiculously complex objects combining to become even more ridiculously complex, and on top of that, self-programmingly ridiculously complex. In any case, just looking at myself gave me the impression of that quote, long before I read it.
 
  • #11
Hi Meatbot – are you familiar with functionalism? It was a concept initiated by Putnam back in the 1960s.
Its core idea is that mental states (beliefs, desires, being in pain, etc.) are constituted solely by their functional role — that is, their causal relations to other mental states, sensory inputs, and behavioral outputs. Since mental states are identified by a functional role, they are said to be multiply realizable; in other words, they are able to be manifested in various systems, even perhaps computers, so long as the system performs the appropriate functions. While functionalism has its advantages, there have been several arguments against it, claiming that it is an insufficient account of the mind.
Ref: http://en.wikipedia.org/wiki/Functionalism_(philosophy_of_mind)

So yes, the mechanical consciousness proposed in your OP would often be held to have mental states, per functionalism.

Meatbot said:
Who is to say that a rock is not in some way minimally conscious? It has signals that propagate through the vibration of atoms. The atoms respond to each other's vibration. It can take input from vibration of the rock, and from temperature changes. It can change its structure by melting. This is obviously different from what the brain does, but is it different enough that it's not conscious? I don't know.

Putnam has since written a book, “Representation and Reality” (1988), in which he does a 180 and decides functionalism is false, and therefore computationalism fails. In the appendix of the book, he makes the often-quoted statement, “Every ordinary open system is a realization of every abstract finite automaton.” This argument basically proves that even a rock can have conscious mental states, for exactly the reasons you give. Putnam points out that such states as are in a rock are purely symbolic and arbitrarily chosen, which is obviously correct and accepted by the most ardent computationalists like Dennett and Chalmers.

To defend computationalism, Chalmers wrote a paper entitled “Does a Rock Implement Every Finite-State Automaton?” (it can be found online). Similar attacks on Putnam’s argument were made by Copeland, Chrisley, and Endicott.

In Putnam’s defense, Bishop wrote numerous papers which strongly support Putnam’s view, as did a philosopher named Maudlin.

There are similar attacks on computationalism by Harnad, who points out that computationalism is symbol manipulation and who came up with an argument called “the symbol grounding problem”. Searle also attacks computationalism by noting that computations are not intrinsic to physics.

Personally, I have to agree with the anti-computationalists. The biggest problem right now with any computational view is simply defining “computation”. What is a computation? The most brilliant minds in philosophy have thus far failed to produce an adequate definition that shows how a computation can be intrinsic to anything physical in nature.
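Putnam's appendix claim can be sketched in a few lines. Take any sequence of distinct physical states (a "rock" passing through vibrational states) and any run of a finite-state automaton; an after-the-fact labeling maps one onto the other. The point, as the anti-computationalists read it, is that such a labeling is arbitrary and does no explanatory work. All names here are illustrative:

```python
# A "rock" passing through four distinct physical states over time.
rock_states = ["r0", "r1", "r2", "r3"]

# Any finite-state automaton run we care to pick (states may repeat).
automaton_run = ["A", "B", "A", "C"]

# Because the rock's states are distinct, an arbitrary labeling always
# exists that maps its trajectory onto the automaton's run.
labeling = dict(zip(rock_states, automaton_run))

# Under that labeling, the rock "implements" the automaton's run.
implemented = [labeling[s] for s in rock_states]
```

The labeling is chosen after the fact and purely symbolically, which is exactly the feature Putnam exploits and his critics attack.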
 
  • #12
Meatbot said:
I think he would be conscious, and he would think he was you. If you asked him, he'd claim to be you. He'd have all of your memories. There would be two consciousnesses. I think consciousness is something that emerges from certain arrangements of matter.

Well then again, why should he have a separate consciousness, since everything was cloned 100%? What is different, such that he should have a separate consciousness?
 
  • #13
I think a mechanical brain would be conscious in direct proportion to how closely it simulated the various features and functions of the brain.

But it would run EXTREMELY slowly...
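To get a back-of-envelope sense of that slowness, with all numbers being rough assumptions rather than measurements: synaptic events happen on the order of a millisecond, while a clockwork neuron lobbing metal balls might manage one event per second:

```python
# Rough orders of magnitude only; both figures are assumptions.
biological_event_s = 1e-3   # ~1 ms per synaptic event
mechanical_event_s = 1.0    # ~1 s per ball toss (a pure guess)

slowdown = mechanical_event_s / biological_event_s   # ~1000x slower
# One subjectively felt second would take this many real-world minutes:
subjective_second_min = slowdown / 60
```

Under these guesses a single felt second stretches to roughly a quarter of an hour of wall-clock time, which is what "EXTREMELY slowly" cashes out to.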
 
  • #14
Sorry! said:
I've always wondered the same sort of thing. Only imagine we had a machine that can make EXACT copies of you (scans 100% accurate, recreates 100% accurate).

Would you sort of share consciousness? Or does he have a new consciousness? Is it even conscious at all? Hm.

I agree with Meatbot that he would be conscious and would probably debate with you about who is the "real you". There is no real way to resolve who IS the real you after a perfect biological fax is done, although after a period of debate both of "you" would have to admit that your experiential perceptions have now diverged and the two consciousnesses are no longer identical.

Another amazing thing, when you think about creating mechanical conscious machines, is how we have the sense that our conscious self remains constant for our entire lives, up to about 100 years. How is it that every time we either wake up from sleep, wake up from anaesthesia, etc. that we haven't been "re-booted" and have to start all over with a new consciousness? Or do we, and we don't realize it somehow?
 
  • #15
sysreset said:
Another amazing thing, when you think about creating mechanical conscious machines, is how we have the sense that our conscious self remains constant for our entire lives, up to about 100 years. How is it that every time we either wake up from sleep, wake up from anaesthesia, etc. that we haven't been "re-booted" and have to start all over with a new consciousness? Or do we, and we don't realize it somehow?
Robert J. Sawyer wrote a book, with this as a subplot, called https://www.amazon.com/dp/0765311070/?tag=pfamazon01-20.
 
  • #16
sysreset said:
Another amazing thing, when you think about creating mechanical conscious machines, is how we have the sense that our conscious self remains constant for our entire lives, up to about 100 years. How is it that every time we either wake up from sleep, wake up from anaesthesia, etc. that we haven't been "re-booted" and have to start all over with a new consciousness? Or do we, and we don't realize it somehow?

I think that question has a pretty simple answer.
Your brain with all its 'data' didn't change.
In theory, if you died and were resurrected somewhere else with the same brain and memories in another life, you would still be 'you', granted your memories were intact.
 
  • #17
octelcogopod said:
I think that question has a pretty simple answer.
Your brain with all its 'data' didn't change.
In theory, if you died and were resurrected somewhere else with the same brain and memories in another life, you would still be 'you', granted your memories were intact.


I agree with your conclusion, but to me it is almost unbelievable that a person can undergo a coronary bypass, be put under general anaesthesia for hours, maybe have an accidental seizure or two, and then wake up and in a few hours feel pretty much like they are the same person with the same consciousness that they had for the prior 60 years. I am sure that the brain with all its 'data' did change during that process, but just not in areas critical for consciousness.

Conversely, there are instances of individuals who are otherwise intact but have temporary or even permanent bouts of amnesia, thus apparent disjointed or 'rebooted' consciousness. There must be very specific brain areas which are responsible for maintaining a sense of a lifetime of consciousness, and it may map to areas involved in long-term memory.
 
  • #18
sysreset said:
I agree with Meatbot that he would be conscious and would probably debate with you about who is the "real you". There is no real way to resolve who IS the real you after a perfect biological fax is done, although after a period of debate both of "you" would have to admit that your experiential perceptions have now diverged and the two consciousnesses are no longer identical.

Then what happens immediately after cloning? Since every molecular and atomic arrangement in the two bodies is the same, why is it that you should be looking through your body and not his? What happens to your consciousness?
 
  • #19
Oerg said:
Then what happens immediately after cloning? Since every molecular and atomic arrangement in the two bodies is the same, why is it that you should be looking through your body and not his? What happens to your consciousness?
It sounds like you are talking sci-fi cloning. Immediately after real cloning, you are still a single-celled egg.
 
  • #20
DaveC426913 said:
It sounds like you are talking sci-fi cloning. Immediately after real cloning, you are still a single-celled egg.

Well, I think they were referring to my post, not the OP, so if that's the case then I WAS talking about the sci-fi sort of cloning... with a machine that makes perfect 100% copies.
 
  • #21
Sorry! said:
Well, I think they were referring to my post, not the OP, so if that's the case then I WAS talking about the sci-fi sort of cloning... with a machine that makes perfect 100% copies.

This has always been something that has bothered me, and I take comfort in not knowing the answer, and in the thought that maybe our very being and consciousness is unique to us humans.
 
  • #22
A good sci-fi book that looks into this sort of consciousness-identity issue is Greg Egan's "Diaspora".
 
  • #23
After a hypothetically EXACT carbon copy of your entire being is made (sci fi as of today) you would have two separate consciousnesses, one in each copy, and both consciousnesses would immediately diverge. They would share past memories but would then be just like any other set of identical twins with similar traits but gradually diverging world views. And of course, the unsolvable argument over who is really the valid original being.
 
  • #24
Oerg said:
Well then again, why should he have a separate consciousness, since everything was cloned 100%? What is different, such that he should have a separate consciousness?

I would think because consciousness is generated by matter and it's a different collection of matter. If it's not different, then you'd be able to experience the input from both sources. You'd be able to see what the clone sees. I think that would require that consciousness is somehow not only determined by the atoms involved, but also by something apart from them. In that case, the two people would be "connected" by some type of consciousness-bearing medium.
 
  • #25
Moridin said:
The China brain does create a mental state. This is yet another one of those intuition pumps that fall to bits when analyzed, just like the Chinese room. Our intuition that it is impossible is just a bias against non-neuron minds, furthered by the implausibility of the scenario. There is a natural desire to locate the mind at a specific point, because the mind feels like one thing.

You're right Moridin. This topic is about as subjective as it gets.

The terminology has to be defined, and I'll start by noting how "awareness" is often mistaken for "consciousness" and vice versa.

I think you can safely say that a machine can be made aware of things around it, but to say it has a "conscious awareness" would be a stretch, because the word consciousness applies to human, neuron-based interaction with the environment. The environment can be other neurons in the brain, and it can be wind, sun, and water, etc.

You can never get inside another person's head and determine if THEY are conscious or not... so how is it that we can so happily determine the conscious... or unconscious... state of a machine? Turing tests, reflex tests, etc. do not properly gauge the existence of consciousness. They help determine awareness of certain criteria, but, as we know, what we call consciousness has remained an elusive target for scrutiny after centuries of use of the word.

I think the subject might be sensitive because the word "consciousness" has become an almost religious term, and so applying it to the function of a machine may appear as blasphemy to those people used to using it with regard to "enlightenment".

The term consciousness has been completely struck from the vocabulary of the neurosciences and replaced with "awareness" due to the above complications.
 
  • #26
"Will any physical system that reproduces the functions of a human brain be conscious?"

This is not a scientific issue, I think... this is more a question about "the philosophical existence of consciousness".
 
  • #27
Personally, I think that any intelligent system is conscious to a certain degree. The more intelligent the system is, the more conscious it becomes. Consciousness should effectively be treated as a perception or experience of the mechanism which enables the system to compute. That need not imply, however, that consciousness equals choice; choice is of course a virtual concept. Consciousness means that a certain intelligent system experiences its own computation. However, it can never escape the programming which is hardcoded into it. The programming steers the conscious aspect to do certain tasks, as one might see it. Compare to the internet, which I believe is conscious to a very large degree, yet it cannot escape its original programming, just as much as humans cannot escape the physical laws in our brains which guide our neurons and eventually cause our actions. There is no 'choice'.

This is the only plausible theory I have thus far encountered that could explain why we are conscious yet there still is no choice; if someone has a better one, I am open to it, as I have no idea what the reason is that prompts intelligent systems to become conscious.
 
  • #28
cuddles said:
The more intelligent the system is, the more conscious it becomes.

What would it mean to be more conscious than you are now? I think that degrees of consciousness are on a continuum, but what would it feel like if you were more conscious? Perhaps your mind would be able to process more things at once, or keep more things in active memory at the same time, or think in multiple streams simultaneously. It is well known that the brain shuts off attention to details in the environment that it deems irrelevant. Perhaps more of these details would be part of your awareness?
 
  • #29
I would say it means being more aware of one's own computation. Though I realize this definition is highly informal, since units for that field are, at least to my knowledge, not yet defined, it is hard to give a good formal one.
 
  • #30
Look at consciousness from a behaviorist's point of view. We infer that certain people or animals are conscious based on their behavior. But there is a trap there. Take, for example, a person who is conscious but 100% paralyzed due to a medication or a brain injury. Others have no way of identifying conscious activity (without a functional MRI, that is), and for all appearances that person is not a conscious being, exhibits absolutely no behavior whatsoever or any apparent recognition of surroundings; yet the person is conscious. So behaviorism isn't the answer. But then the same dilemma appears when evaluating whether a very complex network of computer circuits or a complex network of living beings is conscious. You see vast swarms of bees or flocks of birds acting in concert; is the individual bird any more or less conscious than the entire swarm of bees? Is there a quantum unit of consciousness, and will it ever be possible to measure whether ANY system has more or fewer units than another?
 
  • #31
sysreset said:
Look at consciousness from a behaviorist's point of view. We infer that certain people or animals are conscious based on their behavior. But there is a trap there. Take, for example, a person who is conscious but 100% paralyzed due to a medication or a brain injury. Others have no way of identifying conscious activity (without a functional MRI, that is), and for all appearances that person is not a conscious being, exhibits absolutely no behavior whatsoever or any apparent recognition of surroundings; yet the person is conscious. So behaviorism isn't the answer.

I would say that basing belief in others' consciousness on behaviorism breaks down with a much simpler argument: suppose we sufficiently programmed a humanoid (appearance-wise) robot that was intelligent but without any consciousness. From a behavioral standpoint it does all it needs to do, and could act just as 'complicated' as a conscious human could.
 
  • #32
cuddles said:
I would say it means being more aware of one's own computation. Though I realize this definition is highly informal, since units for that field are, at least to my knowledge, not yet defined, it is hard to give a good formal one.

But then again, I would say humans are not in any shape or form aware of their own computation.
The computation the brain does is a purely physical thing, it's neurons, chemicals and electrical signals in the brain, while the choices humans make on a subjective level is not really the computation itself, but rather a side effect of that computation.

A good example would be if a human wants to eat an apple.
He hasn't eaten for a substantial period, there is an apple on the table, and nothing is really stopping him from eating it, if he thinks "hey I haven't eaten in awhile I'm hungry and that apple looks good" how much of that is actual choice?
His stomach already sent signals to the brain, he already liked apples since he was a kid, which makes the brain perceive the apple as an edible and good object before he ever decided to, and he probably already ate several apples which helped the brain become what it is.

I would say that really you can't choose something before your brain has been given any kind of input to process some kind of logical answer within itself regarding the thing in question.

Hence, my conclusion would be that any self-awareness regarding choice and self-computation would rather be your brain being able to process more information at the same time, which would in practical terms mean that a "more conscious" person would probably have a brain that would be a lot more picky about apples.
I say this because (and without going too off topic) the brain in general really does seek to have perfection to whatever it thinks perfection is, and I do believe that the more information the brain has and can process about something, especially at the same time, the more nuances are picked up, and the more imperfections are noticed and NOT discarded, maybe like shape, age, color or whatever else.

However, something to note is that I believe any of this is involuntary on the subjective level; it's more controlled by the physics of the brain and, to some extent, the sensory system of the body too. Always a slave to those things, I would say.
 
  • #33
sysreset said:
Look at consciousness from a behaviorist's point of view. We infer that certain people or animals are conscious based on their behavior. But there is a trap there. Take, for example, a person who is conscious but 100% paralyzed due to a medication or a brain injury. Others have no way of identifying conscious activity (without a functional MRI, that is), and for all appearances that person is not a conscious being, exhibits absolutely no behavior whatsoever or any apparent recognition of surroundings; yet the person is conscious. So behaviorism isn't the answer. But then the same dilemma appears when evaluating whether a very complex network of computer circuits or a complex network of living beings is conscious. You see vast swarms of bees or flocks of birds acting in concert; is the individual bird any more or less conscious than the entire swarm of bees? Is there a quantum unit of consciousness, and will it ever be possible to measure whether ANY system has more or fewer units than another?


Indeed. In fact, the question is a very old philosophical one, and is sometimes called "the hard problem" of consciousness, or the mind/brain problem.

There are two stances on it:
- behaviourist/materialist, who claim that "if it walks like a duck, and talks like a duck, it is a duck", and the duck is the prototype of "human consciousness". It is the similarity in behaviour that makes us suppose that other human beings are "conscious" (have a subjective experience). The Turing test is an example of this kind of reasoning, just as is the Chinese room. But there's a confusion here between *intelligence* (= capacity to solve problems) and "consciousness".

- dualist, who holds that subjective experience (consciousness), although coupled to the material world, is nevertheless something else. The "philosophical zombie" is an illustration here. Essentially, dualist arguments are anti-materialist arguments, in that they argue that behaviour or material organization ALONE cannot be taken as irrefutable proof that there is a subjective experience.

I'm personally seduced by the dualist arguments, I have to say. Of course, there is sometimes a confusion between the dualist position (which is just a refusal of the behaviourist position) and a mystical/religious position of "soul" and so on, which is often used by materialists to attack the dualist position. Dualism doesn't mean mysticism or religion, however. It is just the recognition that subjective experience is not "provable" externally by material, behavioural means.

An argument I always like to use against materialists is that they are too anthropocentric, and cannot imagine different forms of consciousness: "does it hurt a rock when it is broken?"

No matter how silly this question might be at first sight, if you analyse what "causes pain" in a "conscious being", then it is very difficult to give a fundamental reason why a rock cannot be suffering pain from being broken, while you do suffer pain if you break your leg.
 
  • #34
Let us say that pain is something that results from a particular type of electro-chemical reaction. If a rock does not have this electro-chemical reaction, there is no pain.

If qualia is an incoherent concept, then all p-zombies are as well. Pain is not something that you can just strip off a person's mental life without bringing about any behavioral or physiological differences. This solves the "hard" problem of consciousness quite nicely. One of the most used counters against dualism is that it is vague and lacks primary attributes (or "substance", as it was earlier called).

There are various other arguments, such as Mary the color scientist, etc., but they are not that convincing. Mary has been in a black and white room and can only observe the outside from a black and white monitor. She knows all the physical facts about colour, including every physical fact about the experience of colour in other people, from the behavior a particular colour is likely to elicit to the specific sequence of neurological firings that register that a colour has been seen, etc.

The question is whether she will learn something new when she steps out. The dualist would say yes, but is pretty much unable to state what, in more than a vague sense. The materialist would simply argue that if she knows everything about it, this grants her omniscience in that area, and she will learn nothing new.

Besides the vagueness of the term 'dualist' and of whatever it is supposed to be that experiences these qualia, I think the position amounts to a reservation against reductionism in human experience and behavior. P-zombies and so on seem to be intuition pumps.
 
  • #35
octelcogopod said:
Hence, my conclusion would be that any self-awareness regarding choice and self-computation would rather be your brain being able to process more information at the same time, which would in practical terms mean that a "more conscious" person would probably have a brain that would be a lot more picky about apples...the brain in general really does seek to have perfection to whatever it thinks perfection is, and I do believe that the more information the brain has and can process about something, especially at the same time, the more nuances are picked up, and the more imperfections are noticed and NOT discarded, maybe like shape, age, color or whatever else.

I guess my dog, who would eat a rotten apple right out of the trash, is conscious, but less conscious than I.

I notice that a lot of the posts seem to touch upon (but do not explicitly state) a link between consciousness and senses such as sight, taste, or sound. "More consciousness" could be imagined to enhance our existing five senses, or to create new ones...

It is interesting to speculate what the side effects of "more consciousness" would be. Is more consciousness better or worse? Would I become super-picky about apples, or more appreciative of the same apples? Or would I become unaware of apples because I am busy with other thoughts?
 
  • #36
Hi Vanesch,
An argument I always like to use against materialists is that they are too anthropocentric, and cannot imagine different forms of consciousness: "does it hurt a rock when it is broken?"
I'm not disagreeing with anything you said, just thought I'd point this one thing out...

People who accept the standard computational paradigm of consciousness intuitively make the mistake of believing that the phenomena of various conscious experiences, such as the experience of pain, heat, or orgasm, should naturally correlate to those experiences we have. They reason that the behavior we exhibit, such as flinching and subsequent avoidance of pain, is an evolutionary reaction which aids our survival. This is a type of category error which is almost as difficult to understand as the category error of taking the color red to be a property of something we're looking at. Obviously, the color red doesn't exist as a property of an object; it is a phenomenon which is created within our brains.

The category mistake I’m referring to is the belief that the computational model allows for qualia to have some kind of influence over our behavior. It doesn’t - and this has some rather drastic implications.

The movement of a switch has nothing to do with what qualia may be experienced. Nor should the subsequent operation of numerous switches have any correlation to specific qualia. Qualia do not influence the operation of switches. Qualia are subjectively experienced and are not objectively measurable. But the operation of a switch is influenced by the application of electric charges, mechanical forces or other objectively measurable influences.

If qualia don’t influence the operation of switches, they cannot influence behavior, and they cannot come to our aid when we experience a negative influence such as pain. The experience of pain cannot influence which switch or set of switches is going to activate, so there is no evolutionary benefit to qualia given the computational model. There is no need, for example, to associate the qualia of a bad experience with the behavior of avoiding pain. Given the computational model, qualia have no influence over the physical substrate which creates them. Therefore, qualia are epiphenomena under the assumption that consciousness is computational. And if qualia are epiphenomena, then there is no reason for them to correlate reliably with our behavior.
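The argument above can be put in toy form. The following sketch (hypothetical, not from the thread) models a switch whose output is fixed entirely by an objectively measurable input; a subjective `qualia` label is accepted but inert, so in this model qualia can play no causal role in which switches flip:

```python
# A toy "switch" in the sense of the argument above: its output depends
# only on the measurable input (charge vs. threshold). The `qualia`
# parameter is a purely subjective label and is never consulted, so it
# cannot influence the switch's behavior.

def switch(charge, threshold=1.0, qualia=None):
    # `qualia` is deliberately ignored: only the objective input matters.
    return charge >= threshold

# Attaching any experience label leaves the outcome unchanged.
assert switch(1.5) == switch(1.5, qualia="pain")
assert switch(0.5) == switch(0.5, qualia="orgasm")
```

Whether this faithfully captures real computational theories of mind is exactly what is in dispute; the sketch only illustrates the epiphenomenalism worry as stated.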

In the case of your broken rock example, the rock breaks because there are internal stresses in the rock which are higher than what can be resisted. There’s no reason for a rock to experience pain if it does the rock no good. Why shouldn’t the rock experience pleasure at being broken? How about an orgasm?

If our behavior can not be influenced by qualia, I’d rather go through life experiencing constant orgasms than pain, frustration, and sadness.

Edit for clarification: The 'category mistake' made by computationalists is the belief that qualia, which are not objectively measurable phenomena, can have an influence on objectively measurable interactions.
 
Last edited:
  • #37
sysreset said:
I guess my dog, who would eat a rotten apple right out of the trash, is conscious, but less conscious than I. ...

Is more consciousness better or worse? Would I become super-picky about apples...
I'm not sure that your dog's table manners are about consciousness at all. Organisms eat foods they can eat and avoid foods they think will be bad. Your dog will not be harmed by rotten apples. The lowliest of creatures - with no consciousness whatever - will pick foods they can eat.

I once read an article about some highly intelligent birds that were able to abstract situations to a surprising degree. They would not simply hide food from competitors, they would deliberately set up decoys. It could be shown, too, that this was not merely instinctive behaviour, as the birds did not set up the decoys when no other birds were watching.

The article attempted to categorize different levels of consciousness:

I know - "I have a treat"
I know I know - "I know I have something of value and will hide it for later"
I know you know - "I know you saw me place the treat under that rock and can deduce your own logic about it."
I know you know I know - (ah I can't remember this one)

Consciousness thus is described as an ability to empathize - to not merely conceive of your own thoughts, but to recognize that others perceive their thoughts too, and that there's a connection.
 
Last edited:
  • #38
sysreset said:
I guess my dog, who would eat a rotten apple right out of the trash, is conscious, but less conscious than I.

So you are implying that consciousness is quantifiable along just a single dimension, even though the method of measuring it may or may not be known to us at present?
 
  • #39
Shooting star said:
So you are implying that consciousness is quantifiable along just a single dimension, even though the method of measuring it may or may not be known to us at present?

I was actually just making a slightly tongue-in-cheek remark in response to octelcogpod's earlier post equating "more consciousness" with, for example, being "more picky about apples." But you are correct: we don't know how to measure consciousness, and even if we did, it would very likely be a multidimensional phenomenon.

But my flippant remark does touch on something more serious, which is the consciousness of animals. I think that when discussing whether machines can ever achieve consciousness, one must first explore the nature of animal consciousness, which is certainly different from human consciousness. Strictly speaking about the senses, animals can have more enhanced awareness than humans. For dogs, obviously, the senses of smell and hearing are so many orders of magnitude more acute than ours that it makes their "experience" a bit hard to comprehend.
 
  • #40
Meatbot said:
Will any physical system that reproduces the functions of a human brain be conscious?

If you accept the definition that society actually uses for consciousness, then the answer is no. It seems to me that most folks define consciousness as: "The sum of all mental processes and experiences that only humans possess."

Though to be fair to you, the answer really does depend on what the word means. "Consciousness" is one of those unfortunate words that has no specific meaning. The term is a horrid conflation of various concepts that, to me, seem related to each other only through something like "things that humans feel about experiences." Until the word acquires a specific meaning, you can't answer questions like "Does this meat possess consciousness?"

It would be just as productive to ask if a given structure possesses "fruglyalitousness," which is a word I've just made up that means: "the specific quality that no thing possesses."

In summary, neither your complex wooden thingy nor humans are conscious.

Which is why this thread has joined every other discussion of this topic throughout time and space in the same fate: becoming an attempt to agree on a definition that will satisfy everyone involved.
 
Last edited:
  • #41
Quatl said:
In summary, neither your complex wooden thingy nor humans are conscious.

"Most" humans are conscious by definition. You can try to work out what can be meant by "most". If you are shifty enough, you can say that now defining consciousness has become equivalent to defining "most", and the discussion has remained at the same degree of irresolution.

I don't want to enter into semantics now, but I don't subscribe to your summary. Do you?
 
  • #42
Shooting star said:
"Most" humans are conscious by definition. You can try to work out what can be meant by "most". If you are shifty enough, you can say that now defining consciousness has become equivalent to defining "most", and the discussion has remained at the same degree of irresolution.

I don't want to enter into semantics now, but I don't subscribe to your summary. Do you?
Actually I do, but not in the sense that I think you're asking. I think that consciousness is not a very coherent concept. I know that I have experiences, some of which even seem to be about experiencing other experiences. I assume that most other humans are similar, and furthermore that other things seem to have them too, in varying degrees.

I think it is more useful to use the word only as loosely as it is defined; that is, as a broad category used to describe various reported or observed symptoms of mental function/action/experience/structure.

Then we can ask intelligent questions about various items in the category, which we might be able to actually define, and thus make testable predictions about.
...
Now one way of asking these questions is really purposeless, and that is to ask whether a given thing has some particular subjective experience or not. That question isn't answerable in any way other than giving in to our own sense of plausibility. (Plausibility often misleads us about truth.) Basically you either "like" the idea or you don't.
 
  • #43
Quatl said:
If you accept the actual definition that society uses for consciousness, then the answer is no. It seems to me that most folks define consciousnesses: "The sum of all mental processes and experiences that only humans possess."
I've never heard of such a strange way of defining consciousness.
It is not human-centric and it is not the sum of experiences.


A much more appropriate way of defining it IMO is the degree to which an entity is able to distinguish itself from its surroundings, or self-recognition - to have the concept of "me".

Very illuminating experiments have been done putting critters in front of a mirror. Conscious critters are able to recognize themselves in a mirror. Because of the way the experiment is set up, it can be shown that they do not simply recognize "another" like themselves; they recognize that that critter is "me", and those other ones are not me.

Humans, dolphins and chimps show this ability.

Dogs and cats (and human babies, while we're at it) show this ability to a much smaller degree, suggesting that they are just in the twilight of consciousness.
 
Last edited:
  • #44
Meatbot said:
What if you have 10 foot neurons made of wood with mechanical clockwork innards, and they shoot different types of metal balls to each other to send messages. What if they have some mechanical method for reproducing every other function of a neuron? What if they are arranged in exactly the same way the neurons in the brain are? Will that group of wooden neurons be conscious?

What if you take a group of 100 million people and have each one perform the duties of a neuron? Will that system be conscious?

Seems to me you would have to say that it would be conscious, as strange as that sounds.
Yes, agreed. The very fact that it sounds strange is precisely why it is counter-intuitive, and that in turn explains why so many people (supporters of Searle's Chinese Room argument and Ned Block's Chinese Nation argument) reject the idea as nonsensical: they simply cannot stretch their limited imaginations far enough to accommodate it.
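The substrate-independence intuition behind the wooden-neuron and China-brain scenarios can be made concrete with a toy sketch (hypothetical, not from the thread): a tiny threshold network whose behaviour depends only on its wiring and update rule, not on what physically carries the signals between units.

```python
# Toy threshold network. `carrier` stands in for the substrate (ion
# channels, metal balls, people passing messages); it only has to
# transmit a value faithfully. Two different carriers then produce
# identical firing histories from the same wiring.

def run_network(weights, thresholds, inputs, steps, carrier):
    """Simulate a small network of binary threshold units."""
    n = len(thresholds)
    state = [0] * n
    history = []
    for t in range(steps):
        # Each unit sums external input plus carried signals from all units.
        drive = [inputs[t][i] + sum(carrier(weights[j][i] * state[j])
                                    for j in range(n))
                 for i in range(n)]
        state = [1 if drive[i] >= thresholds[i] else 0 for i in range(n)]
        history.append(tuple(state))
    return history

# Two substrates: both merely preserve the signal they are given.
electrical = lambda s: s           # charge along an axon
wooden_ball = lambda s: float(s)   # a metal ball whose weight encodes the value

weights = [[0, 1], [1, 0]]         # two mutually excitatory units
thresholds = [1, 1]
inputs = [[1, 0], [0, 0], [0, 0], [0, 0]]  # kick unit 0 once, then let it run

# Same wiring, different "stuff": identical behaviour.
assert run_network(weights, thresholds, inputs, 4, electrical) == \
       run_network(weights, thresholds, inputs, 4, wooden_ball)
```

Of course, identical input-output behaviour is exactly what the functionalist takes to be sufficient and what Searle and Block dispute; the sketch only shows that the behavioural equivalence itself is unproblematic.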

Moridin said:
The China brain does create a mental state. This is yet another one of those intuition pumps that fall to bits when analyzed, just like the Chinese room. Our intuition that it is impossible is just a bias against non-neuron minds, furthered by the implausibility of the scenario. There is a natural desire to locate the mind at a specific point, because the mind feels like one thing.
Agreed.

DaveC426913 said:
I've never heard of such a strange way of defining consciousness.
It is not human-centric and it is not the sum of experiences.


A much more appropriate way of defining it IMO is the degree to which an entity is able to distinguish itself from its surroundings, or self-recognition - to have the concept of "me".

Very illuminating experiments have been done putting critters in front of a mirror. Conscious critters are able to recognize themselves in a mirror. Because of the way the experiment is set up, it can be shown that they do not simply recognize "another" like themselves; they recognize that that critter is "me", and those other ones are not me.

Humans, dolphins and chimps show this ability.

Dogs and cats (and human babies, while we're at it) show this ability to a much smaller degree, suggesting that they are just in the twilight of consciousness.
I think consciousness must be more than simply the ability of self-recognition. We could program a simple machine (equipped with a video camera) to "recognise itself" when it views itself in a mirror, but it does not follow that such a machine would necessarily possess consciousness. Conclusion: be careful about jumping to the conclusion that an animal (or even a human baby) possesses consciousness (as we understand it) simply from the fact that it is able to recognise itself.
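The "simple machine" point is easy to make concrete. Here is a minimal hypothetical sketch (the class and signature are invented for illustration, not from the thread): an agent that passes a naive mirror test by bare pattern matching against a stored self-signature, showing that self-recognition behaviour alone is trivially programmable and so cannot by itself establish consciousness.

```python
# A toy machine that "recognizes itself" in a mirror by matching a known
# marker (compare the dot-on-forehead variant of the mirror test). No
# inner experience is required to produce this behaviour.

class MirrorTestMachine:
    def __init__(self, self_signature):
        self.self_signature = self_signature  # e.g. a known visual marker

    def observe(self, image_signature):
        # Declares "me" on a bare string match; nothing more is going on.
        return "me" if image_signature == self.self_signature else "not me"

robot = MirrorTestMachine(self_signature="red-dot-on-forehead")
assert robot.observe("red-dot-on-forehead") == "me"       # passes the test
assert robot.observe("blue-dot-on-forehead") == "not me"  # rejects others
```

This is exactly the false positive conceded later in the thread: the behavioural test can be satisfied by a system we have no reason to call conscious.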
 
  • #45
DaveC426913 said:
A much more appropriate way of defining it IMO is the degree to which an entity is able to distinguish itself from its surroundings, or self-recognition - to have the concept of "me".

Very illuminating experiments have been done putting critters in front of a mirror. Conscious critters are able to recognize themseelves in a mirror. Because of the way the experiment iis set up, it can be shown that they do not simply recognize "another" like themselves, they recognize that that critter is "me", and those other ones are not me.

Is a cockroach a conscious organism? Is a snake?
 
  • #46
Shooting star said:
Is a cockroach a conscious organism? Is a snake?
I would say no.
 
  • #47
moving finger said:
I think consciousness must be more than simply the ability of self-recognition. We could program a simple machine (equipped with a video camera) to "recognise itself" when it views itself in a mirror, but it does not follow that such a machine would necessarily possess consciousness.
Point taken.
 
  • #48
DaveC426913 said:
Humans, dolphins and chimps show this ability.

Dogs and cats (and human babies while we're at it) show this abiltiy to a much smaller degree, suggesting they they are just in the twilight of consciousness.

You are giving examples of what we call very intelligent life. And they need not be so, as moving finger has pointed out.

If snakes and cockroaches are not conscious according to you, give a few more examples so that we may be able to understand, not what is generally accepted to be meant by consciousness, but your notion of consciousness. This is how ultimately people communicate, so don’t think this is a challenge of some kind.

Always, invariably, in a discussion of this sort, concepts get mixed up and shift meanings, the notable ones being:

Consciousness
Display of consciousness
Self awareness
Theory of mind
Intelligence
(and ultimately)
Life.
 
  • #49
I forgot to mention another nemesis:

Freewill.
 
  • #50
Shooting star said:
You are giving examples of what we call very intelligent life. And they need not be so, as moving finger has pointed out.

If snakes and cockroaches are not conscious according to you, give a few more examples so that we may be able to understand, not what is generally accepted to be meant by consciousness, but your notion of consciousness. This is how ultimately people communicate, so don’t think this is a challenge of some kind.
Well, I guess my definition of consciousness revolves around whether an organism is capable of conceiving of the concept of "me".

My earlier suggestion about looking in a mirror is not a definition of consciousness (it wasn't meant to be), but an indicator. We can't know what any animal is actually thinking, so we do experiments to see what it might be thinking. Its behaviour indicates that it is capable of understanding itself as a distinct entity. I would not say that a flight-survival instinct or a "recoil from pain" reaction constitutes an organism understanding itself as a distinct entity.

As for the robot designed to recognize itself, we can know what it is thinking (which is: nothing), so we know that, in this case, our test gives a false positive. That's OK.


I'd say a human baby (<6 months) is arguably not conscious. If he pokes himself in the eye with his own toe, he doesn't even know it was a part of his own body. It takes him months to understand himself as an independent entity.


If snakes and cockroaches are conscious then IMO that drastically dilutes the definition of the word. We'd now have a hard time defining a difference between "life" and "consciousness", which makes it kind of useless.

A counter-question: What lifeforms are not conscious?
 
Last edited: