Can Everything be Reduced to Pure Physics?

  • Thread starter: Philocrat
  • Tags: Physics, Pure
Summary
The discussion centers on the claim that everything in the universe can be explained solely by physics. Participants express skepticism about this assertion, highlighting the limitations of physics and mathematics in fully capturing the complexities of reality, particularly concerning consciousness and life. The conversation touches on the uncertainty principle, suggesting that while physics can provide approximations, it cannot offer absolute explanations due to inherent limitations in measurement and understanding.

There is a debate about whether all phenomena, including moral and religious beliefs, can be explained physically. Some argue that even concepts like a Creator could be subject to physical laws, while others assert that there may be aspects of reality that transcend physical explanations. The idea that order can emerge from chaos is also discussed, with participants questioning the validity of this claim in light of the unpredictability observed in complex systems.

Overall, the consensus leans towards the notion that while physics can describe many aspects of the universe, it may not be sufficient to explain everything, particularly when it comes to subjective experiences and the nature of consciousness.

In which other ways can the Physical world be explained?

  • By Physics alone?

    Votes: 144 48.0%
  • By Religion alone?

    Votes: 8 2.7%
  • By any other discipline?

    Votes: 12 4.0%
  • By Multi-disciplinary efforts?

    Votes: 136 45.3%

  • Total voters
    300
  • #301
vanesch said:
I think this is THE fundamental, difficult issue, if any, that the physical sciences have to consider, if they want to claim (as I think they should do) to understand everything in this world, at least in principle.

Oh come now. Don't you realize that this is just being obstinate and silly? It's obvious that everything behaving like me is conscious. :smile:
 
  • #302
vanesch said:
If, as said before, we proceed by induction, and we assume that, because other people look and behave like myself, then we can reasonably take as a working hypothesis that most other people, in their awake state, also must have "conciousness". All that is all right. But it didn't tell us at all *what* is conciousness, and how we can measure, observe, do anything with it. Yes, we can measure brain processes. We can correlate certain brain activities with awake people, and assume they have something to do with conciousness.

I want to jump in and say that I have never argued with this. Solving the problem of what consciousness is and how it is generated is another question entirely from the only question I meant to address. The only point I was ever trying to make is that consciousness is not, in principle, excluded from consideration as a naturally selected trait because it has no obvious evolutionary value.

A related question is: are higher animals conscious ? I know that biologists seem to think they aren't, but how can they know ?
Ignoring the existence of conciousness, as some seem to do, is no solution. I am concious. Conciousness thus exists.

Actually, biologists assume that higher animals are conscious. In order for consciousness to be an evolved trait, like any other trait, it must have arisen gradually through many species to eventually emerge in the form which we experience as conscious humans. If biologists didn't believe higher animals to be conscious, then researchers at Cal Tech and USC would not be performing consciousness studies on non-human primates.
 
  • #303
Fliption said:
Oh come now. Don't you realize that this is just being obstinate and silly? It's obvious that everything behaving like me is conscious. :smile:

No, it isn't, and if you are referring to anything I ever said, you have misinterpreted it. I have said that it is obvious that other humans are conscious, not that any machine or creature that behaves like a human is conscious. If you can't see the difference, so be it. Just realize that the position you are mocking is not my position.
 
  • #304
loseyourname said:
No, it isn't, and if you are referring to anything I ever said, you have misinterpreted it. I have said that it is obvious that other humans are conscious, not that any machine or creature that behaves like a human is conscious. If you can't see the difference, so be it. Just realize that the position you are mocking is not my position.

The inference that you make about humans is not exempt from the point that Vanesch is making, regarding our inability to identify consciousness. You are merely appealing to common sense. What I mock is the use of common sense as an exception to a real philosophical issue.
 
  • #305
And I don't think questioning whether or not other humans are conscious is a "real" issue. That is all. We disagree. I have people on my side, you have people on your side. All are intelligent people. So be it.
 
  • #306
vanesch said:
we assume that, because other people look and behave like myself, then we can reasonably take as a working hypothesis that most other people, in their awake state, also must have "conciousness".

It may be a bit difficult to understand why, but it's perfectly possible to know that other people are conscious. It's not an assumption, not a working hypothesis, but a well-established fact. If for some reason you find out that other people do not have what you call consciousness, it only means you didn't know what the word "consciousness" means. You learned it from other people, and they couldn't tell you they have something only you have. They would not have a word for it.

But it didn't tell us at all *what* is conciousness, and how we can measure, observe, do anything with it.

It is wrong to think that we have a "problem of consciousness" just because we don't know what consciousness is. We don't know what most things "are" if taken in that sense. If we have a problem of consciousness because of that, then we also have a problem of space, a problem of time, a problem of object, a problem of language, a problem of numbers, a problem of intelligence... the list is endless!


But can we make one day a machine that is concious?

If we make a machine that appears to be conscious, we won't think it is conscious because we already know how to account for its behavior using other concepts. We need the concept of consciousness to explain why people do some things they do; we don't need the concept to explain how computers work, we already have simpler concepts like boolean logic and electricity.

We cannot hide our ignorance anymore behind the simple induction of: "I'm concious, he looks like me, so probably he's conscious too."

When your computer says "You have mail", you don't think it's conscious, do you? That is because you fully understand why it is doing it without having to invoke a conscious entity inside the computer. It's misleading to think we will one day build a machine so complex we won't understand it, and will be forced to come up with a concept like consciousness to explain its behavior. The reason is simple: we can't build a machine we don't understand. In fact, it's hard enough to build machines we fully understand.

I take it that my PC is not conscious (but HOW do I really know that ?).

You really know that because, if you claim your computer is conscious, an electrical engineer can explain to you what you mean by "conscious" in very precise terms (the same terms an engineer uses to build the computer in the first place). The concept of consciousness will then become a superfluous idea which can be stated in more precise, simpler terms.

A related question is: are higher animals conscious ? I know that biologists seem to think they aren't, but how can they know ?

As far as humans go, consciousness implies the existence of language abilities. If you can communicate using symbols, then you're conscious; if you're not, then it's impossible to know.

To answer the question about animals, until we discover if they use language (they might; after all they make a lot of noises), we have no way to know if they are conscious.

Ignoring the existence of conciousness, as some seem to do, is no solution. I am concious. Conciousness thus exists.

I never heard of anyone who denies the existence of consciousness. All I know about are people who don't subscribe to certain metaphysical views, but I never saw anyone claiming humans are not conscious.
 
  • #307
Egmont said:
It is wrong to think that we have a "problem of consciousness" just because we don't know what consciousness is. We don't know what most things "are" if taken in that sense.

Not really. We have a quite accurate working definition of, say, an ethanol molecule. We have techniques to measure the amount of ethanol in a given liquid. We don't have anything comparable for conciousness.

If we make a machine that appears to be conscious, we won't think it is conscious because we already know how to account for its behavior using other concepts.

That's a bit silly: you reduce conciousness to a behavioural mystery. If there is no behavioural mystery, then there cannot be conciousness. So that would mean that people whose behaviour is perfectly predictable aren't conscious (so they don't feel pain when we hit them :Devil:).


When your computer says "You have mail", you don't think it's conscious, do you?

I don't know. If YOU tell me that I have mail, do I conclude from that that you are / aren't conscious ? I don't think that behaviourism can indicate the presence or absence of conciousness. It is not impossible to think of machines that do what many people do, a lot of their time.

You really know that because, if you claim your computer is conscious, an electrical engineer can explain to you what you mean by "conscious" in very precise terms (the same terms an engineer uses to build the computer in the first place). The concept of consciousness will then become a superfluous idea which can be stated in more precise, simpler terms.

Again, you seem to equate conciousness with "cannot be explained". So some parts of my car are concious, until I find out how it works ?

As far as humans go, consciousness implies the existence of language abilities. If you can communicate using symbols, then you're conscious; if you're not, then it's impossible to know.

Again. That's not what conciousness is about. Computers talk to each other over a network. Ants talk to each other using pheromones. Is "establishing a communication protocol and link" the same as "conciousness" ?
You know in your bones that that's not it. It is this "awareness of existence" together with what can be qualified as "feelings" we're talking about.
Is a human being who lost his linguistic abilities by accident not a conscious being anymore ?

To answer the question about animals, until we discover if they use language (they might; after all they make a lot of noises), we have no way to know if they are conscious.

We have no way, indeed. But linguistic abilities are not the same thing as conciousness. And, we might one day make machines that mimic so well ordinary human conversation, that you will be tricked into thinking they are concious. But all this is behaviourism. We're missing the point.


cheers,
Patrick.
 
  • #308
More Questions About Consciousness


1) What is the BENEFIT of people installed in separate visual perspectives or spatiotemporal frames of reference feeling the same pain, or seeing the same patch of red colour, and knowing what it is like to do so? What will we now, or in the end, do with this sort of knowledge?

2) Does consciousness have VISUAL PRIORITY LEVELS extending outward and inward? For example, there are many physical events and actions in the human body that the body is already able to carry out automatically, and these are claimed to have no relation to consciousness. If there are visual priority levels, ought we not to argue that some things, events, or actions in our body are given more visual attention than others, and that things the body is capable of doing without much visual attention are gracefully but naturally deprioritised? Could we not argue that, despite being visually deprioritised, such things, events, or actions are nevertheless still conscious? And that at the outer, higher conscious level, more visual attention is given to LIFE-CRITICAL things and events from within and from without? This question has serious implications for the need to reconcile 'SEQUENTIALISM' with 'SIMULTANEITY' in the overall mental and physical processes. Personally, I keep a very open mind on this, but if we still believe that there is a clear distinction between mental states and physical states, the pulling tension between Sequentialism and Simultaneity is a problem that plagues the two on equal terms. And I think that this problem has a connection with why we think of visually deprioritised things and events as unconscious or visually unattended, or even with why we created the gulf between physical states and mental states in the first place.

3) What PHYSICAL SIZE must anything have before it can qualify to possess or have consciousness? Does size matter? I am asking this question because in nearly every discipline, we tend to erroneously believe that only the human form of a given or well-established 'SIZE' is capable of conscious existence. This automatically invokes my next question.

4) Does Consciousness come in DEGREES or in different GRADES? Obviously this question depends on our answer to (3), or simply on accepting that different kinds of biological organisms, regardless of their forms, sizes, time scales or modes of existence, can possess consciousness of some sort in the first place. What I am getting at is this: couldn't we just accept and say that other organisms may or do possess consciousness, only that it comes in different grades or degrees? If this were the case, we ought then to argue that consciousness has a unique purpose and that, coming in different grades or categories, it ought to be structurally and functionally improvable towards some sort of structural and functional 'perfection'. However, even then, this still leaves one outstanding question to be cleared up: if consciousness in the end survives the destruction of the human of which it is a part, which aspect of it will remain, let alone be relevant?

5) Is 'COLLECTIVE CONSCIOUSNESS' or 'GLOBAL CONSCIOUSNESS' or 'UNIVERSAL CONSCIOUSNESS' or 'POOLED CONSCIOUSNESS' possible? If it exists, will this solve the 'Qualia Problem'? I ask this question because there are some scientists who are already attempting to take it from the realm of paranormalism into the realm of 'science'. I will provide some postings on this later.
 
Last edited:
  • #309
loseyourname said:
The only point I was ever trying to make is that consciousness is not, in principle, excluded from consideration as a naturally selected trait because it has no obvious evolutionary value.

I understood that by reading the previous posts in this monster thread. I have no hard opinion on the issue, but would like to raise a point. If something is to have evolutionary value, is it not more intelligence than conciousness ? Which then links to another question: is conciousness necessary for intelligence ? Personally, I don't think so, but I have no strong arguments. Intelligence is much more "measurable" than conciousness, and I think, in 20-30 years, we will have very intelligent machines, intelligent in the sense that they can do lots of "smart things". But to me, conciousness is that "other" aspect of our being, namely, as I wrote earlier, our "awareness", our "feelings", the fact that "pain hurts". I know this is vague, but that's exactly the problem !
There is something ethical about conciousness, because "pain hurts". Once we know that certain machines would be concious, I think ethically they should have fundamental rights, such as the right for not being tortured or so.
On the other hand, I don't think you can do unethical things to your PC. But maybe one day I will stand on trial because I made a big, conscious computer in the basement, which I then tortured during years because I'm a perverted lunatic :-)

cheers,
Patrick.
 
  • #310
vanesch said:
If something is to have evolutionary value, is it not more intelligence than conciousness ?

I said the above, because I'm quite convinced that conciousness has nothing to do with behaviourism, in that conciousness _observes_ but does not necessarily _act_. If the acts come from intelligence, then the behaviour (and hence the survivalistic value) of an animal with or without conciousness, but with the same intelligence should be the same.

cheers,
patrick.
 
  • #311
is this a thread?
 
  • #312
vanesch said:
We have a quite accurate working definition of, say, an ethanol molecule. We have techniques to measure the amount of ethanol in a given liquid. We don't have anything comparable for conciousness.

That's not true. It's a lot easier for anyone to determine if a person is conscious, than it is for chemists to determine if a certain liquid contains ethanol. We don't have instruments to measure consciousness simply because we don't need them.

That's a bit silly: you reduce conciousness to a behavioural mystery. If there is no behavioural mystery, then there cannot be conciousness.

It's not silly but, as I said, it may be difficult to understand why. At least it was difficult for me. I will try and elaborate.

If you think about our knowledge, it consists primarily of two types of entities: things we can directly observe, and things we postulate to exist which cannot be directly observed. The former are what we call entities in the physical universe; the latter are called many different names: laws, principles, forces, and so on. This immediately begs the question: why do we need to postulate the existence of things which cannot be observed? Why not simply stick to what we observe?

The reason we come up with unobservable entities is quite simple: they are implied to exist as an explanation of what we observe. Here is an example: we observe objects falling to the ground all the time, but we don't understand why. Somehow it seems the fact that objects fall must necessarily imply the existence of some entity which is causing them to fall. As we all know, that entity is now known as "gravity", but it's an often ignored fact that the only way we can observe gravity is by observing its effects. To this day, no one has asserted the existence of gravity as an entity which exists on its own; in fact that is just impossible, given that the absence of observable effects implies the absence of gravity itself (one could say we have a "problem of gravity").

Back to consciousness. It certainly is not part of the physical universe, as it can't be directly observed. "Consciousness" is a postulate, an entity which we conceive of in an attempt to explain certain aspects of human behavior. And that being the case, like gravity, consciousness cannot be postulated to exist in the absence of observable effects.

So that would mean that people whose behaviour is perfectly predictable aren't concious

No, that is not what it means. The issue here is, how do we explain a certain behavior? Sometimes you have to invoke the unobservable entity called "consciousness", sometimes you can explain behavior invoking simpler, more intuitive concepts. The latter alternative is always preferable.

It is not impossible to think of machines that do what many people do, a lot of their time.

Machines already do a lot of what people do, often better, yet no one seriously considers them to be conscious. The reason, as I said above, is simply because we have simpler, more intuitive explanations.

Again, you seem to equate conciousness with "cannot be explained". So some parts of my car are concious, until I find out how it works ?

Nope. If you are an educated person, you know that an explanation for the workings of your car exists, and you know that it doesn't involve consciousness. On the other hand, if you are an ignorant person, you may be tempted to do exactly what you said: invoke the existence of conscious entities that cause your car to behave the way it does. That's why primitive peoples tend to believe the world is filled with "spirits" - they lack better explanations, that's all.

You know in your bones that that's not it. It is this "awareness of existence" together with what can be qualified as "feelings" we're talking about.

In all honesty, I have no idea what "awareness of existence" means, so I can't really comment on that. As to "feelings", mine are always associated, without exceptions, with bodily sensations, so it's possible they are the same thing.

linguistic abilities are not the same thing as conciousness.

I didn't say they were the same thing, I said the only way to establish the presence of consciousness for sure is by checking for linguistic abilities. That is because most other behaviors can be explained without invoking the concept of consciousness, but language cannot.

And, we might one day make machines that mimic so well ordinary human conversation, that you will be tricked into thinking they are concious.

As I said, primitive peoples are often tricked into thinking inanimate objects are conscious. The only thing that may cause a human to think a computer is conscious is ignorance of the way the computer works.
 
  • #313
Egmont said:
As I said, primitive peoples are often tricked into thinking inanimate objects are conscious. The only thing that may cause a human to think a computer is conscious is ignorance of the way the computer works.

Ok, we have very different definitions of the word "conciousness" then.
Concerning your definition of what is part of the physical world (can be directly observed) and what are "explanatory concepts" (all the rest), I'm afraid that you also have a very different definition of "the physical world", a very restrictive one. You seem to say that the physical world contains only those entities that can directly influence your senses. Visible light exists, but UV light is an "explanatory concept". Rain drops exist, but water molecules are an explanatory concept. Indeed, if you reduce the physical world to this definition, then it is probably hopeless to talk about conciousness, which, if anything, you will for sure classify as an explanatory concept, together with gravity, EM waves, atoms, molecules, black holes, neutrinos and extrasolar planets.

cheers,
Patrick.
 
  • #314
Egmont said:
In all honesty, I have no idea what "awareness of existence" means, so I can't really comment on that. As to "feelings", mine are always associated, without exceptions, with bodily sensations, so it's possible they are the same thing.

Yes, they are the same thing. That's part of my definition of conciousness. Does a human being have "bodily sensations" ? (yes) Does a monkey ? (highly probable) Does a rabbit ? (probably, I don't know) Does a mouse ? (probably, I don't know) Does a bird ? (probably, I don't know) Does a bee ? (don't think so, but maybe). An ant ? (don't think so) A microbe ? (don't think so) A virus ? (nope) An embryo ? (depends on the age ?) A computer ? (not for the moment) A brain ? (?)
Will it be possible, one day, to make a machine (not necessarily in silicon, might be molecular or biomolecular, but fully artificial) which is of the same complexity as an ant ? As a bee ? As a bird ? As a mouse ? As a rabbit ?
Do you see where I am aiming at ?

Also, you say that you have no idea of your awareness of existence, even when you are awake. You must not be serious when you claim that ! You are not aware that you exist ?

I have difficulties pinning down exactly my "definition" of conciousness, but that is not my fault, it is because it is a very slippery subject. But let me try to work with some of its properties. Probably the closest I come to conciousness is that "pain hurts". You can define "pain" in a behavioural way, in that an entity "tries to avoid" stimuli which are "painful". But that's of no use in itself: I can train a robot to run away from, say, a green ball, and then we should conclude that the view of a green ball is painful for the robot.
You could also define something of the style that pain is what is conceived, correctly or incorrectly, by an entity, as something that is damageable.
But that's also not true: I can put you in an intense neutron beam, and you won't feel anything. However, you'll probably develop a serious cancer within the next months or years.
Conciousness is present when pain also hurts. If I hit your foot with a hammer, and when the nerve signals reach your brain, it hurts, then you are concious. The opposite is not necessarily true: it is not because you don't feel pain that you aren't concious, but if you DO feel pain, then you ARE concious. I don't think that a tree can feel pain. I'm not sure if an ant does, or a bee. A mouse, not really sure, it might be purely behavioural (a behavioural response to a stimulus which we associate with pain, by analogy, if "they would do that to us"), but you get the idea that a mouse might feel pain. A monkey, I think it does feel pain. I'm pretty sure you feel pain.

cheers,
Patrick.
 
Last edited:
  • #315
vanesch said:
Ok, we have very different definitions of the word "conciousness" then.

Who doesn't?

Concerning your definition of what is part of the physical world (can be directly observed) and what are "explanatory concepts" (all the rest), I'm afraid that you also have a very different definition of "the physical world", a very restrictive one.

I don't find it restrictive at all. There's plenty of stuff around me that I can see, hear, touch, smell, taste.

You seem to say that the physical world contains only those entities that can directly influence your senses.

Pretty much.

Visible light exists, but UV light is an "explanatory concept".

Actually, "light" is also an explanatory concept.

Rain drops exist, but water molecules are an explanatory concept.

Exactly. But you can't overlook an important aspect: so long as we don't have a better explanation, we are forced to acknowledge that "water molecules" must exist even if we can't observe them. At the same time, to assert that we know that water molecules exist with the same certainty we assert that rain drops exist is a bit of an oversight, in my opinion.

Indeed, if you reduce the physical world to this definition, then it is probably hopeless to talk about conciousness, which, if anything, you will for sure classify as an explanatory concept, together with gravity, EM waves, atoms, molecules, black holes, neutrinos and extrasolar planets.

I don't find it hopeless. Keeping in mind that consciousness is an explanatory concept helps understand why it generates so much heated debate. As you certainly are aware, asserting the existence of explanatory concepts can be troublesome at times. So long as there are alternative explanations, the world will always be filled with people who deny that consciousness exists. I don't think I'm one of these people because, as with molecules and UV rays, we are forced to accept the existence of consciousness because it is the best explanation for human behavior. I think I only disagree that we know for sure it exists because we "experience" it. The same reasoning could be used to assert the existence of a lot of stuff, like ghosts and lake monsters.
 
  • #316
vanesch said:
I understood that by reading the previous posts in this monster thread. I have no hard opinion on the issue, but would like to raise a point. If something is to have evolutionary value, is it not more intelligence than conciousness ? Which then links to another question: is conciousness necessary for intelligence ? Personally, I don't think so, but I have no strong arguments. Intelligence is much more "measurable" than conciousness, and I think, in 20-30 years, we will have very intelligent machines, intelligent in the sense that they can do lots of "smart things". But to me, conciousness is that "other" aspect of our being, namely, as I wrote earlier, our "awareness", our "feelings", the fact that "pain hurts". I know this is vague, but that's exactly the problem !

A couple of things about this. First, remember that a trait does not necessarily need to be of value to fit into an evolutionary framework. There are other mechanisms I've spoken of by which relatively superfluous traits may evolve. That said, I do think that consciousness is of value. If you go back to my example of functionally conscious machines - that is, a machine that is capable of performing all of the functions that a conscious human can, but without having any subjective experience, you'll see a point I made about computing power. The ability to strategize holistically seems to still be fairly lost on computers, as evidenced by the fact that we have yet to design a program that can defeat a human skilled at the game of Go. I'm sure there are other examples, but that is the best one I can think of. A program that would be capable of doing this, that is, strategizing holistically about this very complex game at least as well as a human can, would require even more computing power than is currently available - but the most powerful computers already have far more computing power than does the human brain. So I think it is clear that being conscious gives us the ability to economize - to perform these functions with minimal computing power. Without consciousness, I'd imagine our heads would be much larger and use up far more oxygen and ATP than they already do, which in and of itself is a disadvantage.

There is something ethical about conciousness, because "pain hurts". Once we know that certain machines would be concious, I think ethically they should have fundamental rights, such as the right for not being tortured or so.
On the other hand, I don't think you can do unethical things to your PC. But maybe one day I will stand on trial because I made a big, conscious computer in the basement, which I then tortured during years because I'm a perverted lunatic :-)

You know, I hadn't even thought of that. There is clearly an advantage in having members of a species instilled with a certain sense of ethics, and that cannot be achieved unless individuals of that species are capable of "feeling" subjectively.
 
  • #317
Egmont said:
I don't think I'm one of these people because, as with molecules and UV rays, we are forced to accept the existence of consciousness because it is the best explanation for human behavior. I think I only disagree that we know for sure it exists because we "experience" it.

My opinion is exactly the opposite. Consciousness (I realize I've been misspelling the word all along ! English is not my mother tongue...) in the way I define it has absolutely no explanatory power, and - as I tried to point out - I think there is no link between consciousness and behaviourism. Probably if we will know the brain functions much better, the behaviour of a human being can be perfectly well explained as a function of these brain processes. So I don't need the concept to explain observations... except for the very observation that *I* am conscious (and by analogy, I assume you are too). It is not because a highly advanced medical science will be able to tell me that when you hit my foot with a hammer, a Sodium-Potassium ion density wave will propagate along the nerves, and then will release receptors in this part of the brain, and so pulses will go in that part of the brain, and that this will activate my voice control, and that I will pronounce the word "Aaah", and that other pulses will go to the muscles of my arm and try to grab the hammer out of your hand, and still other pulses will go to other muscles in my leg and my eyes will look at you and I will try to kick your bottom with my foot etc... that I will stop feeling the pain ! So if there is ONE thing I know for sure, it is that I am conscious, and I don't need that as an explanatory concept to explain my behaviour, I feel it.
In that view, the certified existence of consciousness is more a problem for the physical sciences if they want to explain it, than it is an explanatory tool, which, in my eyes, it isn't at all. If ever the physical sciences will have to admit that they cannot explain the whole universe, and that there is not only room for religion, but even necessity, then the issue of consciousness will be central to it. But there is still hope, we are far from having discovered everything the physical sciences can explain, no matter what naive string theorists seem to think.

cheers,
Patrick.
 
  • #318
loseyourname said:
If you go back to my example of functionally conscious machines - that is, a machine that is capable of performing all of the functions that a conscious human can, but without having any subjective experience

Ah, at least someone who has about the same conceptual notion of consciousness as me ! I haven't seen your example, but the very fact that you consider (as I do) that you can have a machine that in every aspect behaves "as would behave a conscious being in the same context" and you consider the possibility that that machine might NOT be conscious means that you do not necessarily adhere to behaviourism (which I define as exactly the opposite: namely, if the behaviour is as "would behave a conscious being in the same circumstances", then it is conscious; which reduces "consciousness" to a set of behavioural patterns and denies anything else it might mean).

But if "conscious-like behaviour" and consciousness are not equivalent in meaning, then this also means that things that might not at all act "conscious-like" can be conscious, and vice versa, and we're back to my central problem: except for oneself, of which one KNOWS one is conscious, and except for other human beings, which we can assume they are conscious by analogy, how the hell are we going to find out if an entity is conscious or not.

Now that we've put behaviourism aside (a good thing, in my opinion), there are two possibilities left open:

1) consciousness can influence behaviour. That means that consciousness somehow acts upon the physical world. This is a difficult-to-accept notion if we look into the functioning of a conscious being (the only one we know of for sure is a human, actually, only white, male humans, because I'm not sure that people of Asian or African origin are conscious, or whether women are conscious, because I'm a white male - no, just kidding :-))) because we might find a complete behavioural description of the conscious being without needing any interference in the physical laws governing that behavioural description (except if we accept that a conscious being can only be based upon a quantum phenomenon, in which the fundamental uncertainties of quantum theory give the room of action to a consciousness to intervene in the behaviour ; Roger Penrose takes this very seriously).

2) Consciousness is a passive observer within a conscious being, whose behaviour is entirely determined by physical processes that are happening in the physical construction making up the body of the conscious being. Our consciousness just undergoes, without any influence, what the body does, all by itself. In this case it will be very hard to determine whether something else than a human being is or isn't conscious.



The ability to strategize holistically seems to still be fairly lost on computers, as evidenced by the fact that we have yet to design a program that can defeat a human skilled at the game of Go.

I'm not convinced by that. It is not because we haven't yet found out HOW to build efficiently such a machine, that it cannot be done. The question will then be if the machine is, or isn't, conscious !


but the most powerful computers already have far more computing power than does the human brain.

I would be surprised by this. If some parts of the brain are quantum computers, then the computing power of the brain might be tremendously higher than by considering that neurons are the equivalent of logic gates, as is usually done in this kind of comparison.

cheers,
Patrick.
 
  • #319
Neurons, even without quantum considerations, are closer to analog CPUs than to logic gates. They can have many inputs with variable and conditional outputs, just as if running a program.
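As a rough illustration of that point (a toy sketch only - the model, function name, and numbers below are invented for the example, not taken from the thread), even a minimal rate-model neuron combines many weighted inputs, produces a graded rather than binary response, and only fires conditionally on a threshold:

```python
import math

def toy_neuron(inputs, weights, bias=0.0, threshold=0.0):
    """Toy rate-model neuron: many inputs, graded output, conditional firing."""
    # Weighted sum of all incoming signals (the "many inputs").
    activation = sum(w * x for w, x in zip(weights, inputs)) + bias
    # Graded (analog) response via a sigmoid, not a 0/1 logic level.
    rate = 1.0 / (1.0 + math.exp(-activation))
    # Conditional output: stays silent unless the rate clears the threshold.
    return rate if rate >= threshold else 0.0

# Example: three inputs with mixed excitatory/inhibitory weights.
print(toy_neuron([0.9, 0.2, 0.7], [1.5, -0.8, 0.4], bias=-0.3, threshold=0.4))
```

Even this cartoon version shows graded, conditional behaviour that a single logic gate does not.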
 
  • #320
vanesch said:
So if there is ONE thing I know for sure, it is that I am conscious, and I don't need that as an explanatory concept to explain my behaviour, I feel it.

But how exactly do you know that whatever it is that you feel is actually called "consciousness"? Think about it, you learned what the word means from other people, but you never had a chance to observe how they feel when they say they are conscious. Do you think it's possible that you associated the word "consciousness" with the wrong feeling? How can you be sure that you experience what other people describe as "conscious" if you have no way of experiencing what they do?

Those are very valid questions, and they clearly show that no one can be sure they are conscious just because they feel it. The reason we know for sure we are conscious is because we behave the same way as people who claim to be conscious behave. It's not because we feel the same way, for we have no idea how they feel.

If ever the physical sciences will have to admit that they cannot explain the whole universe, and that there is not only room for religion, but even necessity, then the issue of consciousness will be central to it.

Well, anything physical can be explained in purely physical terms. To say that science must incorporate religion is equivalent to saying economics must incorporate chemistry, or that biology must incorporate astronomy. Our knowledge is segmented, and religion provides knowledge about questions which lie completely beyond the domain of science.

But there is still hope, we are far from having discovered everything the physical sciences can explain...

I used to think that witnessing a miracle (supposing they happen) could convince any die-hard skeptic that materialism is a false ideology. I later realized I was wrong: the world is full of miracles, they happen every day right in front of our eyes. Materialism is simply the mistaken notion that, because something happens every day, there's nothing special about it. If the sky was filled with a different poem every day, materialists would still claim it is perfectly "natural".

I think it's better just to ignore the materialists; they are a small minority anyway.
 
Last edited by a moderator:
  • #321
It seems I overlooked this sermon I got.

loseyourname said:
All I am doing is going out on a limb here to propose that p-consciousness is not epiphenomenal, and I've also proposed a way to test this hypothesis. The hypothesis proposes that not only is p-consciousness efficacious, but it provides a clear advantage for a human organism. The test is not meant to be proof, which seems to be your quibble.

My quibble is simply that I interpreted your original comments to be ones that deny the hard problem. In the quote above you at least admit that you are going out on a limb and making some assumptions. I see nothing wrong with doing this btw. It's the only way to make progress in the real world. If we allow philosophy questions to stop us from acting, we'll never accomplish anything. But this quote above is very different from the original message I got earlier in the thread. Those statements sounded more like statements of fact and anyone who disagrees is just obviously not thinking very hard about it. It seemed to totally ignore the philosophical issues or even deny them altogether.

The reason I tell you to calm down is that you seem to be getting rather exasperated and you're beginning to be a tad bit insulting.

I suppose I don't respond well to being called obstinate and silly when I can point to many people who agree with my position. Frustration is just a natural response when I perceive someone is being a bit egocentric.

When an answer would further my understanding of another's position, it is not irrelevant, even if that other does not consider it pertinent to the point he is making. A good discussion is facilitated by open lines of communication. Even if you don't see the importance of a particular question or example, it is best to address it. Answering questions will always further a discussion, whereas questioning questions only causes it to go in circles.

There is no hard and fast rule for communication. What you say is sometimes true but other times it is a waste of time. The questions you asked were irrelevant. They still are. If I answer them, it only serves to further lead you down the wrong path by giving you the impression that they are relevant. My point should have been obvious after I posted the other thread. Instead, the posting here still continued seemingly without a clue what I was talking about and with no reference to the other information. So I can understand frustration.

Another thing - please don't bemoan what you find as the deplorable behavior of scientific minded persons on a metaphysics forum. Address my arguments and my examples and my questions. I am the only person you are having this discussion with. There is no need for you to point out that my position may be inconsistent with that of other posters.

I pointed out where other posters disagreed with you to attempt to somehow get across the idea that your views were not without their problems and that one does not need to be obstinate to point them out.

And it is an open forum, with many readers. I try not to forget that. I make observations, including those about behavior, because it just might draw someone else on the sidelines into the discussion. So you won't mind if I don't do what you ask.

Just as we must assume that human sense perception is basically reliable and that there is indeed an external world that we can have knowledge of, we must assume that other humans are indeed conscious.

This is why I still insist your questions were not relevant. You still think I'm talking about whether other humans are conscious. As I've said before, that is merely a byproduct of the real issue, which I was using as an illustration. I couldn't care less about solipsism.

This seems to be where we disagree. You think it is an important question that must be solved, I think it is not.

I hope you see now after this response that this isn't true. I fully understand why certain scientific researchers don't care about such things. But making an assumption so that work can be carried out is very different from totally denying that an assumption was made. Originally, I thought you were doing the latter of the two. Now it seems you are not. So I don't think we actually disagree.
 
Last edited:
  • #322
Egmont said:
I think I only disagree that we know for sure it exists because we "experience" it.

How else would you know of it? How else would you know of anything? Experience is your only access to anything.

Also, help me understand your position a little more. Help me reconcile the quote above, which seems to be open to consciousness not existing, with this quote:

It's a lot easier for anyone to determine if a person is conscious, than it is for chemists to determine if a certain liquid contains ethanol. We don't have instruments to measure consciousness simply because we don't need them.

How can you determine that someone has something when you aren't sure it exists? In other words, how is it that the word consciousness being used in language is proof that everyone is conscious, yet it isn't proof that "something" exists which we are referring to as consciousness?

Also, IMO proving solipsism wrong doesn't end the hard problem. That's like taking a boat to work after a major flood and then once you get to work saying "What flood? There is no flood. See, I got to work!"

The issue with the hard problem is that there is no functional explanation for what each of us refer to as consciousness. Whether other people are conscious or not is simply a byproduct of this situation. You may find alternatives around this particular byproduct (like taking a boat) but it doesn't eliminate the source of the problem (the flood).

Also, I will argue that this method of using language to make a claim about consciousness can only be used, at the very least, to argue that at least two people are conscious. That's all that's needed to establish the concept. Everyone else can pick up the word and attach it to whatever they want to and just believe they are referring to the same thing. The issue still stands in that you cannot know whether any one person or thing is conscious. I may be the only person on the planet that isn't and you would never know because me not being conscious wouldn't impact the language at all.

But how exactly do you know that whatever it is that you feel is actually called "consciousness"? Think about it, you learned what the word means from other people, but you never had a chance to observe how they feel when they say they are conscious. Do you think it's possible that you associated the word "consciousness" with the wrong feeling? How can you be sure that you experience what other people describe as "conscious" if you have no way of experiencing what they do?

This is similar to the inverted spectrum scenario. This is a true situation we find ourselves in and fun to think about. But the only thing this explains is why there is so much confusion when people try to discuss things. It means nothing to the issues of consciousness. I can, if I were a scientist, understand the materialistic principles by which I explain the nature of things around me. I can also understand that I have an experience of something that I refer to as "consciousness". So I can also see that these two don't connect and this creates a hard problem. I do not need to know what anyone else thinks consciousness is. I do not even need to have a word attached to my experience. I just need to notice it and establish it internally as a concept that needs explaining and then note that I can't explain it using the same materialistic techniques I've used to explain everything else.
 
Last edited:
  • #323
Egmont said:
But how exactly do you know that whatever it is that you feel is actually called "consciousness"?

Because that's what *I* call it ! But let us not do that, because apparently the word is already "occupied". Let us have instead a thread on the problem of knowing if an entity is "bewust". By "bewust" I mean the fact that an entity has subjective experiences and feelings, and that pain hurts. I know that *I* am bewust, it is the only thing I really know for sure. From analogy, I take it a reasonable assumption that other creatures around me, who look like me, and behave like me, should also be "bewust" ; at least some of them :wink: . This I don't know for sure, but I take it as an acceptable hypothesis. And now the question is: what exactly is causing this bewustness, and when can we know that something which DOESN'T look at all like a human being can have bewustness. As I explained above, it has nothing behavioural. Behavioural testing can test intelligence if you want, but not the fact whether or not the entity behaving that way has subjective experiences. So bewustness is not an explanatory concept for behaviour. We actually do not need the concept, if it weren't for the very fact that *I* know that I have subjective experiences and feelings.

And now that we have a new word for what I want to talk about, at least we won't be fighting with linguistic arguments about definitions ! I want to talk about the concept "bewust" as I defined above, not about how we should define an existing word. It's the concept, not the word, I'm interested in discussing.

I was assuming that what I defined above as "bewust" came pretty close to what people accept as the definition of conscious. But in order to be able to start reasoning around the same concept, I introduced a new, "virgin" word that should be free of any preconceptions that might differ from what I thought the word meant and hence bring in confusion in the arguments. The price to pay is that I will have to build up the entire definition of my new word myself.

So in order to add to my definition of "bewust", I can add the following: it is in general considered ethically a bad thing to inflict bad treatment on something that is supposed to have bewustness, much more so than just the consequences of material damage that might follow from this bad treatment, in the following way:
it is sometimes not considered "very nice" to cut down trees, but the reasons for that are that we like the sight of trees, that there are some ecological considerations and so on. But trees are cut down (for the wood and so on) and nobody is going to argue that at least, one should do it in a "quick and clean way".
But it IS strongly ethically disapproved to torture young children in your basement. The reason for that is NOT the loss in investment (it took time and money to make them and raise them) or the fact that if you don't kill them, they might turn into mentally desequilibrated persons or whatever... no, it is the very act of torture that is strongly disapproved. And this comes from the fact that we assume children to have bewustness.


cheers,
Patrick.
 
Last edited:
  • #324
Egmont said:
I think it's better just to ignore the materialists; they are a small minority anyway.

I think it is better to ignore mathematicians. They are a small minority anyway.

cheers,
Patrick.
 
  • #325
Fliption said:
I can, if I were a scientist, understand the materialistic principles by which I explain the nature of things around me. I can also understand that I have an experience of something that I refer to as "consciousness". So I can also see that these two don't connect and this creates a hard problem. I do not need to know what anyone else thinks consciousness is. I do not even need to have a word attached to my experience. I just need to notice it and establish it internally as a concept that needs explaining and then note that I can't explain it using the same materialistic techniques I've used to explain everything else.

You come to the essence of the point I was trying to make here. The only difference in opinion I might have is the following: I've not yet given up on scientific inquiry into consciousness. But we haven't even begun! Progress could be made, if one day, we could, say, link together two brains, or a brain and a conscious machine, and your conscious experience "merges" somehow with that of the other entity.

cheers,
Patrick.
 
  • #326
vanesch said:
You come to the essence of the point I was trying to make here. The only difference in opinion I might have is the following: I've not yet given up on scientific inquiry into consciousness. But we haven't even begun! Progress could be made, if one day, we could, say, link together two brains, or a brain and a conscious machine, and your conscious experience "merges" somehow with that of the other entity.

cheers,
Patrick.

Let me be more clear then, because I agree with this. I don't necessarily believe that we cannot one day find a place for consciousness and gain knowledge of it. I simply doubt that materialism can do it. Simply because it insists on doing it in a functional way and I don't believe that it can be reductively explained functionally. So this doesn't exclude science. It just means that there needs to be some assumption changes about what is fundamental in nature and what is emerging from those fundamental parts.
 
  • #327
Fliption said:
I simply doubt that materialism can do it.

I'm not well versed in philosophic terminology, so could you give me your definition of "materialism" ? I thought it meant that everything was explainable in physical laws, but probably it is more subtle than this.

cheers,
patrick.
 
  • #328
vanesch said:
Because that's what *I* call it !

OK, here's how I see it: when you claim you are sure you are conscious, at a minimum I can agree that there is at least one fact about which you have no doubt whatsoever. I can certainly agree with you that for each and every person, at least one fact is true beyond doubt. So for a while let's ignore what it is you are sure about, because it's easier to reach agreement that way.

But let us not do that, because apparently the word is already "occupied". Let us have instead a thread on the problem of knowing if an entity is "bewust".

Good, I like that. However...

By "bewust" I mean the fact that an entity has subjective experiences and feelings, and that pain hurts.

... this is unnecessary complication. You got rid of one problem by introducing a new concept - bewust. Now you introduced the problem back by implying that "bewust" is synonymous with "consciousness". Can we try a different approach?

I know that *I* am bewust, it is the only thing I really know for sure.

Now this is great. If you claim you know you are bewust, and that is the only thing you really know for sure, then no one can possibly have any issue with you. I certainly won't. The problem now is, how do I know what you mean by "bewust"? Well, I have a good starting point: "bewust" is something a person can be really sure they are. I can tell you right away that it doesn't map to my concept of "consciousness", because I don't know for sure if I am conscious, but it does map to something, because there are many things I am really sure I am.

Do you think this works?

From analogy, I take it a reasonable assumption that other creatures around me, who look like me, and behave like me, should also be "bewust" ; at least some of them

Sure, but you're only guessing and you may be wrong. Take me as a test - how can you know if I am "bewust" without guessing? You don't have to guess if it's raining, all you have to do is look outside the window. You don't have to guess if today is Thursday, all you have to do is look up a calendar. So how can you know I am "bewust" with the same level of certainty you know the weather or the day of the week?

This I don't know for sure, but I take it as an acceptable hypothesis.

What if I claim I am also "bewust" because, like you, there is one thing I really know for sure, and I wouldn't mind calling that thing "bewust", since I don't have a name for it. Do you think this works?

And now the question is: what exactly is causing this bewustness, and when can we know that something which DOESN'T look at all like a human being can have bewustness.

This is introducing problems which, at this point, are completely unsolvable. In a single sentence you came up with another concept, "bewustness", assumed it must have a causal relationship with another set of concepts, and started questioning whether things that cannot claim to really know for sure about one fact may be "bewust".

I don't deny your right to proceed that way, but I can guarantee you you will get absolutely nowhere. Trust me, millions of people already tried the same approach and they all failed miserably. Do you honestly think you can succeed?

As I explained above, it has nothing behavioural. Behavioural testing can test intelligence if you want, but not the fact whether or not the entity behaving that way has subjective experiences. So bewustness is not an explanatory concept for behaviour. We actually do not need the concept, if it weren't for the very fact that *I* know that I have subjective experiences and feelings.

Three paragraphs in your post and you completely lost me. This never gets anywhere; never did and never will.

I hope you don't think I'm being cynical or sarcastic - I'm not. What I'm trying to do is draw your attention to a problem that is so widespread, so ubiquitous, that very few people notice it.

And now that we have a new word for what I want to talk about, at least we won't be fighting with linguistic arguments about definitions

That happens to be exactly the ubiquitous problem I just mentioned. Nobody wants to have arguments about definitions, everyone assumes everybody else understands what they mean when they use a word, even as they know it's wrong to assume that.

I want to talk about the concept "bewust" as I defined above, not about how we should define an existing word. It's the concept, not the word, I'm interested in discussing.

You haven't given me a concept, you have given me a word - "bewust". And in order to try and convey the concept you associate with that word, you have given me... more words! How in the world can anyone think language doesn't play any role in this?

I was assuming that what I defined above as "bewust" came pretty close to what people accept as the definition of conscious.

... in which case, why do we need another word?

But in order to be able to start reasoning around the same concept, I introduced a new, "virgin" word that should be free of any preconceptions that might differ from what I thought the word meant and hence bring in confusion in the arguments. The price to pay is that I will have to build up the entire definition of my new word myself.
So in order to add to my definition of "bewust", I can add the following: it is in general considered ethically a bad thing to inflict bad treatment on something that is supposed to have bewustness, much more so than just the consequences of material damage that might follow from this bad treatment, in the following way:
it is sometimes not considered "very nice" to cut down trees, but the reasons for that are that we like the sight of trees, that there are some ecological considerations and so on. But trees are cut down (for the wood and so on) and nobody is going to argue that at least, one should do it in a "quick and clean way".
But it IS strongly ethically disapproved to torture young children in your basement. The reason for that is NOT the loss in investment (it took time and money to make them and raise them) or the fact that if you don't kill them, they might turn into mentally desequilibrated persons or whatever... no, it is the very act of torture that is strongly disapproved. And this comes from the fact that we assume children to have bewustness.

Oh man... do you realize you just can't do that? If defining a new word requires the usage of about a hundred other words, many of them without clear meaning (ethical? ecological? disapproved? mentally desequilibrated?), then the new word ends up no clearer than the one it was meant to replace.

I hope you don't hate me for this. I once engaged in a similar endeavour, a few years ago. I tried to build precise definitions out of very clear concepts which everyone could understand, and after years of frustration I eventually concluded that it was possible, but it had already been done and it was called "physics".

Along those lines, I think it's perfectly possible to come up with a clear, precise definition of consciousness, but that is what scientific researchers and analytic philosophers are already doing. Unfortunately their work will necessarily fail to answer most questions we have about consciousness, but that is only because those questions just don't have answers, period.
 
  • #329
Fliption said:
How else would you know of it? How else would you know of anything? Experience is your only access to anything.

That is plain wrong. I've never experienced the Earth from outer space, but I know it looks blue.

How can you determine that someone has something when you aren't sure it exists?

Do you have nationality? Does "nationality" exist?

In other words, how is it that the word consciousness being used in language is proof that everyone is conscious, yet it isn't proof that "something" exists which we are referring to as consciousness?

The word "God" being used in language is proof that people understand, to some extent, what God is, but it is not proof that God exists. You don't need a real God to be able to think about God. You don't need consciousness to be able to have beliefs about it - all you need is to know what consciousness is.

Also, IMO proving solipsism wrong doesn't end the hard problem.

What hard problem?

That's like taking a boat to work after a major flood and then once you get to work saying "What flood? There is no flood. See, I got to work!"

No, that is like taking a boat to work after a major flood and then saying, over the phone to a lazy colleague, "what do you mean you can't get here? I did!"

Rest assured nobody will take you where you don't want to go. Certainly not me.

Also, I will argue that this method of using language to make a claim about consciousness can only be used, at the very least, to argue that at least two people are conscious.

You clearly misunderstand the point of analytic philosophy. I suggest you read up on the subject. Russell, Wittgenstein, Frege, are always a good start.

The issue still stands in that you cannot know whether any one person or thing is conscious. I may be the only person on the planet that isn't and you would never know because me not being conscious wouldn't impact the language at all.

Well, you can say whatever you want. You can claim that the Earth is flat, that the bible provides a literal account of biological history, that the United States is about to be taken over by the UN. You can make arguments out of knowledge, or you can make arguments out of ignorance. The choice is always yours.

Your statement above is just plain wrong.

Cheers :smile:
 
  • #330
vanesch said:
I'm not well versed in philosophic terminology, so could you give me your definition of "materialism" ? I thought it meant that everything was explainable in physical laws, but probably it is more subtle than this.

cheers,
patrick.

Well perhaps materialism isn't the exact word. I think your definition of materialism is probably close (if we can ever decide what physical means), but if we assume that to be physical means to be able to objectively observe and study, then this alone leaves consciousness out. We would just have to accept experience as a fundamental given in the universe. Now if things started happening like you describe, i.e. consciousness merging, then this to me just confuses the definitions of physical and non-physical, and they don't mean a whole lot at that point, because would gaining knowledge this way be considered objective? Who knows. So the word materialism likely wouldn't mean much either.
 
