Why reductive explanations of consciousness must fail

In summary, Chalmers argues that while physical explanation is sufficient to explain structures and functions in many domains, it cannot fully explain conscious experience. Conscious experience is not just a matter of structures and functions; it also involves the subjective experience of being aware, and this cannot be reduced to purely physical processes, since it is conceptually coherent that those processes could exist without any experience at all. However successful physical explanation has been elsewhere, consciousness remains a uniquely puzzling phenomenon that cannot be fully understood through reductive methods.
  • #176
Hypno

I usually agree with you but you seem to have simply assumed here that consciousness is caused by the brain. Or have I misread your words?
 
  • #177
Canute: I don't think it is a 'simple assumption', but rather a well-thought-through belief that the brain is the cause(/seat/existence/receptor) of consciousness.

And yes, once again Hypnagogue, I understand your point and accept it. I really do appreciate how clearly you explain yourself.

So, let me try to state things as I now understand them. You do not necessarily disagree with my last post, and in fact you probably agree with it: there is reasonable evidence to accept that there is necessarily something within the brain/biology which 'gives rise' to consciousness. The problem remains, though, that our understanding of the brain via materialism simply cannot ever explain how that comes about.

LOL. OK, saying that now, I realize that no new ground has been reached. This is still exactly the initial complaint. Sorry about that, but it's all in good fun still.

I have always agreed with your position, since before I ever did any philosophy of the mind and then after as well. It was part of my rationale of the universe that there is 'the objective', and then there is 'the subjective'. The subjective comes about through the objective, but obviously it is different. The objective is the universe, the subjective is a perception of the universe. As such, the best I could do was to postulate that there was some objective occurrence/event/machinery that created subjective experience. Of course materialism will never find this subjectivity (because it only looks at the objective), but it may very well be able to find the objective machinery that causes it.

And so, yeah, this doesn't resolve the problem as such, but at least it gave me a basis for ignoring it for a while. Wait until we find the direct causality of experience and then we will start to understand whether there is an explanation for experience itself, or whether it is simply something we have to accept as a phenomenon...

I dunno. I am rambling. This topic does naught but confuse me every time I go there. I suspect that that is precisely why no progress has been made in it since the beginning of time.

Shane
 
  • #178
Originally posted by Canute
Hypno

I usually agree with you but you seem to have simply assumed here that consciousness is caused by the brain. Or have I misread your words?

I don't think the brain necessarily causes consciousness as such, but I do think it's obvious (or at least, we have very good reason to believe) that the consciousness that we experience systematically varies as a function of brain activity. This does not imply that the brain actually creates or comprises consciousness; it could be that consciousness (or microphenomenological 'things') exists independently of the brain and brain activity somehow manipulates the pre-existing 'thing.' (I use 'thing' very loosely here, since whatever that 'thing' would be in this case, it would be ontologically distinct in several important ways from physical things, or at least our materialistic conception of physical things.) Either way, it is still reasonable to assume that (for instance) someone with brain activity similar to mine, in a setting similar to mine, will have similar experiences.
 
  • #179
Originally posted by Another God
I have always agreed with your position, since before I ever did any philosophy of the mind and then after as well. It was part of my rationale of the universe that there is 'the objective', and then there is 'the subjective'. The subjective comes about through the objective, but obviously it is different. The objective is the universe, the subjective is a perception of the universe.

I don't think subjective experience should be thought of on the most fundamental level as a perception of something in the way I understand you to mean it, which is as a representation. The way I see it, experience on the most fundamental level is a property of nature just as much as mass or charge, albeit with obvious ontological differences; experience in itself is just experience, a neutral natural phenomenon that need not be inherently representational. Obviously life has evolved to use experiential phenomena as representations of objective reality, but that is just a function that experience has come to serve for life over time rather than an actual inherent property. (By analogy, even if we always observe color to be an element of sexual selection in a species of birds, it doesn't mean that color is inherently sexual-- that is just a function that color has come to serve for the birds over time.)

And so, yeah, this doesn't resolve the problem as such, but at least it gave me a basis for ignoring it for a while. Wait until we find the direct causality of experience and then we will start to understand whether there is an explanation for experience itself, or whether it is simply something we have to accept as a phenomenon...

If Chalmers' argument is correct, then it is impossible even in principle to fully explain consciousness solely in terms of materialistic entities and properties. Even given a perfect theoretical mapping of brain states onto conscious states, we would still not be able to dispense with the explanatory gap using only a materialist framework. (I suggest you read the two articles linked to in my last reply to Dark Wing if you are interested in a more thorough argument.)
 
  • #180
Originally posted by Another God
The obvious next step is to realize that with this constant throughout nature, it is feasible to claim that there is something about the brain which necessarily gives rise to experience.

AG, I commend you on your comments. It seems you have understood Hypnagogue's posts and even linked them to your own thoughts on the objective and subjective. I understand what you're saying and agree for the most part. I do have a question regarding the above quote however. Can I reasonably state this and it be just as accurate...

"The obvious next step is to realize that with this constant throughout nature, it is feasibly to claim that there is something about radios which necessarily gives rise to music."

Yet, radios don't compose music. Tell me your thoughts on this analogy. To make the analogy work, assume that you know nothing of how radios work.
 
  • #181
Unfortunately there is no evidence that anything about the brain gives rise to consciousness. As Hypnagogue says, representational (or intentional, phenomenal etc) consciousness seems to be an evolved product of brain (or vice versa), but nothing suggests that brains are a necessary prerequisite for consciousness. Strictly speaking it isn't even completely clear how to define 'brain'. When you look at the behaviour of microphages or slime mould you have to wonder.
 
  • #182
Originally posted by Canute
representational (or intentional, phenomenal etc) consciousness

Actually, I was trying to say in my response to AG that representational and phenomenal content are two distinct things. The representational content of a subjective experience is an extrinsic, relational property, an 'aboutness' that relates it to objective reality. The phenomenal content of a subjective experience is an intrinsic, inherent property that is entirely self-contained.

For instance, take the subjective experience of the blueness of the sky. This subjective blueness has a representational content insofar as it represents information about the objectively existing sky 'out there.' But this is to be distinguished from its phenomenal content, which is simply its inherent property of perceived blueness. The former seems to be a convenient usage of the latter that life has evolved, but in its most stripped down sense there need not be anything representational about a phenomenal percept.
 
  • #183
You're basically right, I was sloppy. However, although I agreed completely with what you said earlier about this, I'm still slightly unsure whether 'phenomenal' consciousness is fundamental, which you implied.

I suspect that a state of 'emptiness' is not properly described as 'phenomenal consciousness', at least in the sense of 'pertaining to a phenomenon'. That's why I lumped phenomenal consciousness in with the rest.
 
  • #184
Where does a state of 'emptiness' come into play?

Actually, I'm not sure I entirely understand that second paragraph you wrote.
 
  • #185
I was agreeing that phenomenal consciousness is intrinsic, not dependent on representations, and is more fundamental than representational consciousness. But in a not very clear way I was also suggesting that the fundamental consciousness that you argued may be an irreducible property of nature may not be phenomenal or intentional consciousness either. This is because it seems to me that the fundamental experience of consciousness cannot really be called 'phenomenal'.

In a sense there is a phenomenon, namely the experience. But at the limit the phenomenon is the experience, as opposed to being separate to it, whereas 'phenomenal consciousness' suggests (to me anyway) an experience somehow separate to the experiencer, in other words a phenomenon and a consciousness of a phenomenon.

God I'm pedantic sometimes. Sorry.
 
  • #186
In my mind 'phenomenon' is a word that describes the felt experience. So you can't have it separated from the consciousness.


Fliption: I will have to think about it for a while. I keep tossing around the idea that radios do more than just music; they do talkback, advertisements, static, etc. Perhaps it would be more accurate to say radios necessarily give rise to noise.

With this new version though, it seems much more appropriate to me to say yes, the analogy may be correct, although I am sure your point would be missed. You wouldn't like the thought of 'static' being just as meaningful to the radio analogy as consciousness is to the brain. But that is the only way I can see the analogy as being useful.
 
  • #187
AG,

I think "noise" may technically be accurate, but I'm not sure it's relevant to the analogy. Assume you're an alien with no knowledge of radios and you have a radio that will only play one station. You cannot change stations to hear static. This station can play music or have people talking, it doesn't matter. You have knobs to control volume and EQ. Couldn't you, with the limited knowledge of an alien, make the same quote about this radio/music/talking that you made about brains/consciousness?

Here's the point: If you agree that this analogy applies then we have a good example of a situation where having good reasons to believe one way would actually be leading us astray. In this case, that music/talking is an emergent property of a radio's design. (I tried to leave out static because while it is obvious that voices and music are not generated by the radio, I'm not sure about the technical origin of white noise. The importance of static isn't so relevant. I'd include it if I knew you wouldn't say that "the radio does generate the static". Because I'm not technically savvy enough to debate that :smile:)
 
  • #188
Originally posted by Another God
In my mind 'phenomenon' is a word that describes the felt experience. So you can't have it separated from the consciousness.
I'll try to be more clear about what I meant. It's important because if consciousness is not reducible then it must be consciousness in its most 'unevolved' state, or rest state, that is fundamental.

Often this fundamental state is taken to be phenomenal consciousness, as I think Hypnagogue did above, and Chalmers does also. However the term is ambiguous and therefore can give rise to misunderstandings.

This is Austen Clark from http://www.ucc.uconn.edu/~wwwphil/pctall.html

"States of phenomenal consciousness involve a special kind of quality: phenomenal qualities. A state of phenomenal consciousness is a state in which something appears somehow to someone, and phenomenal qualities characterize that appearance."

This is the danger, that phenomenal consciousness is defined too narrowly, as a state in which 'something appears somehow to someone'.

My point was that there are states more fundamental than this, states in which there is no distinction between the something that appears and the someone to whom it appears, and maybe 'phenomenal consciousness' is not the right term for them, given the above kind of usage and definition.
 
  • #189
I think phenomenal consciousness could just as well be 'a state in which something appears somehow.' That's the way I have been taking it, at least. The 'to someone' part implies the presence of a further 'something that appears somehow,' which is the appearance of that 'someone' to whom the experiences are happening, or the appearance of selfhood. So to define phenomenal consciousness in the way you are using it may be a little circular or recursive. For instance, I would define the phenomenal consciousness of red as 'the appearance of redness,' whereas under the definition of phenomenal consciousness that you suggest, it would be 'the appearance of redness in conjunction with the appearance of self.'

In any case, it is just a matter of how we define our terms. But I agree with your central point, which is that experience on the most fundamental level need not necessarily include the experience of selfhood.
 
  • #190
Originally posted by hypnagogue
I agree that, for a theory of consciousness to make sense, it must make reference to some sort of building blocks for consciousness; either in the form of an irreducible and fundamental entity, or in the form of some 'things' that are not themselves conscious but somehow combine to create consciousness.

Now the question becomes: are these building blocks included in our contemporary materialistic ontology? This is precisely where I believe that contemporary materialism must fail in any attempts to really explain consciousness, because I do not think any of the building blocks given to us in a materialistic ontology can do the job of showing us how to explain or deduce consciousness. We need more building blocks.

I've decided to go back... if NOT to the "beginning of time" ...then at least to page 13 of this thread ...and to devote the next hour to catching up with as many posts as have caught my eye.

Let me first see if I get the term "contemporary materialistic ontology" as meaning how "we" (i.e., "scientists") currently think matter is behaving? Please correct me if I got it wrong.

Anyway, based on this definition, I will squeeze in my paradigm to fit.

Matter -- actually "bound-up energy" -- is "behaving" as it ALWAYS has: by "sensing" one another (either as particles or systems) and "responding to" one another. We can discuss examples, but my contention is that THIS represents "consciousness" at its bare minimum.

Might this not be the "building blocks" you've been hoping for?



As a side point here: if your criterion for judging whether an entity is conscious or not is the degree to which it can interact with and be conditioned by its environment, why should physical constitution matter? I understand that you want to start off on surer footing by starting with safer assumptions, but we could (relatively) easily build a silicon based robot that could do the same things. All I am suggesting is that, if biological constitution is to be the most fundamental factor for consciousness in your hypothesis, then that should be your primary assumption. Deriving (as opposed to fundamentally asserting) the necessity of biology for consciousness from an entity's ability to interact with the environment seems to be faulty, since you could just as well derive that a silicon robot should be conscious by the same criterion.
The DETECTION and RESPONSE TO stimuli ("forces") would be all that is necessary. "Conditioning" might BE a "response" but it is beyond my basic parameter for "consciousness".

Thus, I do not propose that "consciousness" is confined to biological systems and, while some may want to discuss "robots" there are plenty of non-biological systems (atoms through galaxies) that are "communicating with one another" even as we speak.

Like you, I have been wary of functionalism as a good starting ground for any hypothesis for consciousness. However, recently I read an argument with a functionalist flavor put forth by Chalmers that gives me pause. If you are interested, it might be appropriate to start another thread on the topic.
Wish you'd put it here in ONE SENTENCE. I'm hesitant to "go shopping" for additional threads to tempt me into discourse.



OK, back to the building block discussion. Suppose for the sake of argument that we eventually isolate the motion of electrons as the most fundamental necessary and sufficient physical correlate for consciousness: whenever we see electrons moving about in such and such patterns, we are confident that there will be such and such conscious experience on the part of the system.
First, why would it be necessary to "isolate the motion of electrons"? We "know" they're always moving, don't we (a serious question)? Perhaps it IS their "motion" that allows them to "pick up on" the presence of OTHER particles (like the positive charge of a proton) but then this would "simply" be part of the MECHANISM for "basic consciousness". No. All we have to do is consider whether the electron's "detection" and "response to" the proton can be considered a basic "unit of awareness".

Now, what in our materialist ontology could account for this? Electron charge? Electron mass? Electron spatiotemporal configuration? What combination of these could you throw together to show a priori that consciousness must be the result? I argue that no combination of these could be thrown together to show that consciousness must result. Rather, at this point, we would have to rework our ontology to grant an entirely new property to electrons, such that we would be able to see a priori that such and such configuration of electrons must result in consciousness. This new property would have to be either a fundamentally irreducible aspect of consciousness on the part of electrons, or it would have to be some kind of microphenomenological property of electrons such that electrons by themselves are not conscious, but when combined in patterns just so, their microphenomenological properties combine to result in consciousness.
There are MANY "forces" that "account for this": electrical charge; magnetic force fields; ionic fields; the em spectrum. Anything that mediates INFORMATION "accounts for" an entity's or a system's "ability" to "sense" and "respond-to" something else. Nothing "new" is needed.

This argument applies to any H2O formula we may wish to hypothesize for consciousness. You say we have not found the formula yet; I say that for any formula built solely from materialist building blocks, we will still not be able to show a priori that this formula must necessarily result in consciousness. We just need more building blocks than materialism will give us.
It is probably true that we will not be able to EVER "show" that electrons, atoms and galaxies are "conscious" ...but we can THINK about it well enough.



I don't think anyone will dispute that it is impossible for a person with the right configuration in our world not to have a mind. The question is whether or not it is a metaphysical impossibility; if the world were different somehow, would consciousness still be the necessary result of the right brain configuration? For instance, in our world it is impossible for an electron not to be attracted to a proton. In a metaphysical world with different laws of physics, this would not necessarily be the case.
First, there isn't any "wrong" "configuration" IF Everything has a "mind" ...of sorts. As to the mind's dependency on having a brain? I contend not. And in a world -- both physical AND metaphysical -- with different "laws of physics", all that would be needed, once again, is that the parts of the "new Universe" be detecting and responding to one another for "consciousness" to be present. And, of course, having all parts NOT sensing and responding would then give us a STATIC -- hence DEAD -- Universe ...and nobody wants THAT.

It is a metaphysical impossibility for a world with identical H2O molecules and identical laws of physics to ours that these H2O molecules not combine to form (given suitable circumstances) a macrophysical substance with properties identical to water in our world. A very straightforward argument involving physical structures and functions can be given to support this claim. It is not at all clear, however, that a metaphysical world that is physically identical to ours should necessarily support consciousness. If it is claimed that this metaphysical world must support consciousness, no substantive argument can be given to support this claim, even in principle, for all the familiar reasons. This is another way of getting at the suggestion that there must be something more than just the physical involved in the phenomenon of consciousness.
Actually, I think I have given a good "substantive argument" to "support the claim" that ALL DYNAMIC WORLDS with systems that are "working together" will have "consciousness" at its most fundamental ...as well as its most complex.


I agree that there is no logical reason to say that it is not in virtue of some property of the brain and its constituents that consciousness exists. However, there is much logical reason to say that such a property is not included in our current materialist ontology.
The "brain" is an evolutionary "afterthought".
 
  • #191
Originally posted by hypnagogue
The idea is not that your artificial bat must be a zombie. The idea is that we can't be certain what effect the different physical constitution has on its purported consciousness. Upon what basis do you claim that the artificial bat must have the exact same experience as the natural bat?

The argument you have put forth so far leaves much room for doubt. In fact, it does little more than beg the question; your argument rests firmly on the assumption that physical transitions and such are all that is responsible for consciousness, whereas this is precisely the issue that is open to question. To advance, you must propose an argument detailing how it must be that the functional constitution of the bat is sufficient for explaining its first person experiences.
Would not ANY entity subject to CAUSE & EFFECT be having an "experience"? It's simply a matter of complexity of both the detection apparatus and the responsive possibilities of whatever.

And if I make it off page 13, I'll consider this hour well spent.
 
  • #192
Originally posted by selfAdjoint
I'm trying to firm up the discussion by pinning down the issues in a case where we don't have all the baggage of human consciousness to contend with. IMHO that's exactly what Nagel did in switching from talking about qualia in people to presumptive qualia in bats. The point is exactly that nobody knows what goes on in a bat's mind so the discussion can remain pure of special pleading.

If you don't like the bat, here's another one. Could an AI be built to sense colors the way people do, with the three receptor bands and intensity differencing and maybe a neural network for identification and memory, and if it could then be run through experiences with colors, some good some bad according to a carefully designed program so it had various associations with various colors, and if it then "discussed" its experience of colors with researchers and showed complex discussion behavior, not programmed in advance, could you then say the device was experiencing color qualia?
I say yes. And of course you know that there are creatures that can sense wavelengths that we cannot, in addition to magnetic and ionic fields, chemicals and other "mediators of information". Nor can we "sense" -- from our vantage point -- the "weak and strong forces" that elementary particles "sense" and "respond to".
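
Just to make selfAdjoint's thought experiment a bit more concrete, here is a rough, purely hypothetical sketch (in Python) of the kind of device described: three receptor bands, crude intensity differencing, and a simple associative memory that accumulates "good" and "bad" experiences with colors. Every name and number in it is invented for illustration, and nothing about it settles whether such a device would experience color qualia.

from collections import defaultdict

class ColorAgent:
    # Toy three-band color sensor with an associative 'experience' memory.

    def __init__(self):
        # valence[label] accumulates signed 'experience' with that color category
        self.valence = defaultdict(float)

    @staticmethod
    def classify(r, g, b):
        # Crude identification by intensity differencing across the three bands
        if r > g and r > b:
            return "reddish"
        if g > r and g > b:
            return "greenish"
        return "bluish"

    def experience(self, r, g, b, reward):
        # Run the device through a colored experience, good (+) or bad (-)
        label = self.classify(r, g, b)
        self.valence[label] += reward
        return label

    def discuss(self, r, g, b):
        # 'Discuss' a color in terms of the associations it has accumulated
        label = self.classify(r, g, b)
        v = self.valence[label]
        feeling = "pleasant" if v > 0 else "unpleasant" if v < 0 else "neutral"
        return f"That looks {label} to me, and my associations with it are {feeling}."

agent = ColorAgent()
agent.experience(200, 30, 30, reward=+1.0)   # a 'good' experience with red
agent.experience(20, 40, 220, reward=-1.0)   # a 'bad' experience with blue
print(agent.discuss(210, 25, 25))
print(agent.discuss(15, 30, 230))

The sketch is behaviorally trivial, of course; the open question in the thread is precisely whether any elaboration of this kind of machinery adds up to qualia.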
 
  • #193
M. Gaspar, your position assumes that information transfer / manipulation is inextricably bound up with consciousness; you say that whenever information is exchanged (detected, responded to, etc.), there will be some sort of attendant experiential component.

This may well be the case, but if it is, it is not something that can be accounted for in the traditional materialistic framework. There is no postulate in materialism that states that information transfer is inherently experiential. By the materialist account, information transfer between two atoms (say) should just be characterized entirely by physically detectable energy transfer between the two; starting from materialist assumptions, we should have no reason to suspect that such an information transfer has anything at all to do with experiences.

So your hypothesis of the relation between information and consciousness would qualify as the beginnings of one possible type of "fundamental, irreducible addition" to the materialist framework that I have been talking about. It would function much like the traditional laws of physics; we would not be able to deduce why this relation holds, we would just accept it as a fundamental and contingent way in which our world works. But as it stands, no such fundamental principle exists in our materialist account of reality; there is no "law of conscious information transfer" in the materialist account, or anything even analogous to it. That is the point I have been trying to make.
 
  • #194
Originally posted by hypnagogue
So you are attempting to point out the uncertainty of our knowledge of consciousness? I don't think any except the most extreme on either side really dispute that notion. At this stage of our understanding (and possibly forever), we just don't know enough to answer your question with much more than educated speculation. But this is a different matter from the subject of whether or not materialism can explain consciousness in principle.

Done! ...both page 13 and a proposal of how "materialism can explain consciousness in principle."

No time to read -- let alone respond to -- the next 4 pages ...which is why I am not CONSCIOUS of their content ...yet.
 
  • #195
Originally posted by hypnagogue
I think phenomenal consciousness could just as well be 'a state in which something appears somehow.'

Fine by me (although I'm still a bit wary of the term 'appears').
 
  • #196
Originally posted by Dark Wing
...the materialistic stance is quite shallow, and no doubt it needs to be fleshed out, but the basis for it may still be there. Take a look at Place and Smart's work on identity theory (I know they are Australian, but hell, we do have some minds all the way down here): it simply states that a Mind state IS a Brain state. What they have done is set up a field to explore: what is a brain state? If you can figure out what that is, then you have the next step to the reduction: take it down to biology, and then ultimately physics, and you have your building blocks for consciousness: but I will address that better where you have mentioned it below.
In the final analysis, the brain could "merely" be an organic device designed through evolution to detect a wide variety of signals, and to interpret them, store them and respond to them. The fact that we have developed instrumentation to further gather data (signals) gives us a "consciousness" that is "aware" of objects that are "invisible" to us by wavelength and by distance. Still, "consciousness" could just be the sum product of what any entity (from particle to any dynamic material system) can detect/interpret/store/respond to.

So, yes, OUR "mind state" could be the same as our "brain state" ...but it is not necessarily the only TYPE of "mind" there is. We may just be on the "high end" of a "consciousness continuum" that is based on the complexity of detection and response possibilities of any given entity. Other entities less endowed ...like, say, an electron, might still have a "mind" by virtue of what IT could "detect" and "respond to" ...which would be, say, the positive charge of a proton.

Looks like I'm going to have to tackle the lengthy Dark Wing post on page 14 a bit at a time. No time to respond to it all now ...but don't want to skip it on my way to page 17.
 
  • #197
Originally posted by Dark Wing
I think that we should at least start at a point where we know that consciousness is the case. (I am aware that people will argue that we are not conscious, and that we are all just robots, but I am going to presume consciousness on the basis of Searle's "seeming" argument). If it is so that biology is conscious, then we can figure out what the constitution of biology is, and then see what the essential ingredients of the physics/biology boundary are. We can then say that they are the essential building blocks of consciousness, as they make biology, and biology is conscious, as it can react and interact with its environment. We can never argue necessity of biology for consciousness. But we can say "check it out, we have a working example, let's see how that happens".
You may SAY that "we can never argue (the) necessity of biology for consciousness" ...but there are many -- maybe most -- who do just that. I say "consciousness" is NOT dependent on biology, but that biological organisms HAVE developed the capacity to "sense" more "stimuli" than, say, a rock. But a rock "senses" gravity ...and some rocks "sense" magnetism. And I say that each "detection and response" is a "level" of consciousness.

So, what makes a biological cell that is reacting and interacting with its environment different from a robot that is showing the same behavioral patterns? Nothing, according to that definition. So the theory has to be expanded to show us how to tell imitation from the real thing (it is called "Artificial Intelligence" after all :eek:)
If the reception, processing and response to information is the sole parameter and if a biological cell and a, what, nano-robot? could perceive EXACTLY the same quantity and quality of incoming information, then there would be no distinction between their respective levels of consciousness. However, a living cell, I believe, probably DOES have the capacity to detect MORE than the nano-robot ...such as chemical information ...unless, of course, the robot was made to detect the exact same things.

Besides that my explanation of consciousness is based on biological or at least physical causation, and that programmed robots ignore the causation part of the initial condition for consciousness and just write the consciousness on top to be run on a bunch of silicon mapping, there seems to be a testable and verifiable way of seeing if a robot is conscious in the same kind of sense that a human is conscious: that it attributes meaning to its environment.
Why do you say that a programmed robot "ignores" physical causation of consciousness? If "detection" and "response" is within the physical realm, then consciousness has been caused by physicality.


It is reacting in a meaningful and productive way TO ITSELF as well as to the environment.
Is one elementary particle "reacting in a meaningful and productive way" when it SENSES and RESPONDS TO other particles via info mediated by the weak and strong forces? I think it does.

Not so much "these atoms moving like such means we will have this conscious experience"; more, I am saying that a certain formation of atoms will produce consciousness in the system
No, not "certain formation of atoms" but ALL formation of atoms...

...the nature of the conscious experience will be dictated by the biology: what kind of biology does this thing have in order to experience the environment with?
The only relevance to biology is the fact that biological organisms are more apt to have more sensors plus a larger repertoire of possible responses to stimuli. In and of itself, consciousness is NOT dependent on biology.

All of that is higher-level stuff that we may or may not predict on an atomic level. All I am interested in is what combinations make consciousness possible:
Answer: the detection and response to information.

...experiencing consciousness is another question altogether.
Anything that is detecting and responding to something is having an "experience" ...which includes EVERYTHING as there is nothing (that I can think of) that isn't detecting and responding to SOMETHING ...even at the QM level.
It’s the combination that matters. Certain combinations make one thing, other combinations make consciousness.
And if you know the combination that makes something biology, then you will know a priori that a certain number of atoms in this combination will make consciousness. It’s like baking a microscopic cake.
And I propose that any operational system -- from atoms, to bugs, to stars and galaxies up to and including the Universe Itself -- is "experiencing consciousness". It's just a matter of DEGREE based on WHAT can be perceived and the range of FREEDOM the system has to respond.

The metaphysical question of "even if it might not be so in our universe, but is it possible for consciousness to NOT result by this mix in another universe" is to me a wonderful question to speculate on, but essentially one with no answer. How can we ever know whether consciousness of this sort is contingent here or a necessary factor of existence? That sort of thing keeps one awake at night.
I sleep very well "knowing" that "consciousness" is FUNDAMENTAL to every constituent part of the Universe ...and has been so since the Universe was compressed into the "Primal Singularity" that preceded the last Big Bang. That's when the Universe "lost Its marbles" ...but It's more coherent now. :wink:

...even if in another universe something other than the pure physical is needed to support consciousness, it means nothing to us here. I will argue that we have all the ingredients for consciousness right here in front of us, we are just not looking hard enough for them.
One does not have to "look hard". One just has to look.

The above of course is merely speculation on my part, put forth with an air of "certainty" because I like to pretend I'm right.
 
  • #198
Originally posted by Another God
So, if it is up to the combination, perhaps there is a critical distinction that needs to be made which I have never seen anyone make: Perhaps there is no such THING as consciousness, perhaps there is a myriad of phenomena that each may be 'conscious experiences'.

So a reductive explanation of 'Consciousness' will fail, because there is no such thing as 'Consciousness'; there are instead attributes of consciousness. If you follow me...

I guess this is similar to saying there is no such thing as 'The Biological World', there are only creatures which may be said to be biological.

Does this make sense/Help?

Yes and no: it "makes sense" but doesn't "help".

I agree that the "myriad of phenomena" we have come to identify as "consciousness" may be something else. However, it is just as likely that there is a myriad of phenomena (like the detection of the physical forces ...or possibly "thought" itself) going on between systems that we have NOT come to identify as "consciousness" and yet ARE examples of it.
 
  • #199
Originally posted by hypnagogue
A lot of philosophical considerations point to consciousness being a fundamental aspect of reality (this thread for example). But supposing that consciousness is on some level fundamental is actually the antithesis of a reductive explanation.
Now THIS I do not understand. How is what I've been saying the "antithesis of a reductive explanation"? Do I not understand that the word "reductive" means to "reduce down" to its barest essentials? Please advise.

As for the biological view, I don't think it's elitist as much as it is pragmatic. We know for a fact that humans are conscious and we have good reason to believe that other animals are conscious as well. The further the systems we consider stray from being human, the less confidence we can have that these systems are conscious. So it is more a matter of starting in an area where we can be confident, learning what we can from that starting point, and then extrapolating to more general systems as our knowledge and theoretical frameworks progress. It may be true that an amoeba (or a rock) is conscious on some level, but for now that is just speculation.
You're reminding me of the story of the guy who lost his keys at night, and the only place it made sense for him to LOOK for them was under the street light ...as everywhere else was DARK and so he wouldn't see them even if they were there.

Of course it's "speculation" ...and may ALWAYS be. Still, I don't mind "groping in the dark" for the "keys".

I am here to present as good a case as possible ...but I don't expect to "prove" anything.
 
  • #200
Originally posted by hypnagogue
I don't know how I feel about that. You can perhaps say that there is no intrinsic property that differentiates a biological system from a non-biological one, but from the 1st person view at least, there seems to be an obvious intrinsic difference between a conscious system and a non-conscious system.

Again: all systems are conscious in MY paradigm.

Besides, even if we accept that what we need to describe are attributes of consciousness, all the familiar arguments still apply as to why we could not explain these attributes reductively in the materialist framework.
The "problem" with "describing attributes" are that we (human beings) tend to equate everything with US! If it mothers it's young and steals a banana, it's conscious. If it scampers away from us (like a coackroach when it SEES us!) it's "merely" "reacting" in a mechanical sort of way.

I believe that we have been too narrow in our definition of what constitutes consciousness. If one uses my parameter, one could actually QUANTIFY "consciousness" along a continuum based on what a system (or particle) can detect at a given point in time and its possible range of responses.
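
To illustrate what I mean by quantifying it (purely a sketch; the systems, "channels" and scoring rule below are invented for the sake of example, not measured facts), such a continuum could be tallied in a few lines of Python:

def continuum_score(detectable_channels, possible_responses):
    # Crude 'degree of consciousness' = detection breadth x response repertoire
    return len(detectable_channels) * len(possible_responses)

# Hypothetical entries, only to show how the quantification would work
systems = {
    "electron": ({"electric charge"}, {"shift orbit", "bind to a proton"}),
    "rock":     ({"gravity", "magnetism"}, {"fall", "align"}),
    "amoeba":   ({"chemical gradient", "light", "touch"},
                 {"move toward", "move away", "engulf"}),
}

for name, (channels, responses) in systems.items():
    print(name, continuum_score(channels, responses))

Nothing hangs on the particular numbers; the point is only that "what can be detected" and "what responses are possible" are, in principle, countable.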
 
  • #201
Originally posted by M. Gaspar
Now THIS I do not understand. How is what I've been saying the "antithesis of a reductive explanation"? Do I not understand that the word "reductive" means to "reduce down" to its barest essentials? Please advise.

To explanatorily reduce a phenomenon P here means to describe how the properties of P come about due to the combination of some other set of phenomena P'. For instance, to give a reductive account of the properties of water means to describe how the properties of water come about due to the electrostatic properties of H2O molecules, under suitable conditions of temperature and pressure and so on. So if we presume that consciousness is ontologically fundamental / contingent in nature, i.e. that it simply exists in its own right and is not composed of smaller 'things,' then we are doing the exact opposite of an explanatory reduction. Instead of showing how consciousness is composed of other, more basic phenomena, we are instead claiming that consciousness cannot be 'reduced' in such a way at all.

I am here to present as good a case as possible ...but I don't expect to "prove" anything.

I appreciate you presenting your case, but perhaps it would be advisable to begin a separate thread on your personal hypotheses of consciousness. This thread is intended to discuss the philosophical feasibility of giving reductive explanations of consciousness in particular, and it is turning into a bit of a 'grab bag' for all things consciousness, detracting from the focus on the main lines of discussion. I welcome your views and will be glad to discuss them with you, but again, this particular thread seems to be getting sidetracked a bit.
 
  • #202
Originally posted by hypnagogue
To explanatorily reduce a phenomenon P here means to describe how the properties of P come about due to the combination of some other set of phenomena P'.
Is this not what I have been doing by explaining the phenomenon of consciousness as being due to the combination of some other set of phenomena ...namely, the DETECTION and RESPONSE TO "information"?

Consciousness is indeed a phenomenon -- a PROCESS -- that I propose goes on at every level of physicality by virtue of the sense/respond phenomenon. It then becomes a matter of the complexity of information that can be PERCEIVED and the potential repertoire of RESPONSES that determines the "degree of consciousness" that an entity "possesses".

For instance, to give a reductive account of the properties of water means to describe how the properties of water come about due to the electrostatic properties of H2O molecules, under suitable conditions of temperature and pressure and so on. So if we presume that consciousness is ontologically fundamental / contingent in nature, i.e. that it simply exists in its own right and is not composed of smaller 'things,' then we are doing the exact opposite of an explanatory reduction.

But I am NOT saying that consciousness is a fundamental "ingredient" of the Universe but the result of a "process" that IS intrinsic to the Universe. The Universe, after all, "hangs together" by VIRTUE of its "ingredients" sensing and responding to each other. Do you not think we are only AMONG a myriad of temporary systems that are operating in a "Cause & Effect" Universe?

Instead of showing how consciousness is composed of other, more basic phenomena, we are instead claiming that consciousness cannot be 'reduced' in such a way at all.
Do you still say I am violating reductionism?

I appreciate you presenting your case, but perhaps it would be advisable to begin a separate thread on your personal hypotheses of consciousness. This thread is intended to discuss the philosophical feasibility of giving reductive explanations of consciousness in particular, and it is turning into a bit of a 'grab bag' for all things consciousness, detracting from the focus on the main lines of discussion. I welcome your views and will be glad to discuss them with you, but again, this particular thread seems to be getting sidetracked a bit.
Actually, I did start a thread -- the "Consciousness Continuum" -- but no takers. I think I will read the article you recommended first so that I may have something to "respond to" myself. Then I will present my proposal under Theory Development.

Meanwhile, it was not my intention to "hi-jack" this thread but to use my "personal hypothesis" as an EXAMPLE of "why reductive explanations of consciousness" NEEDN'T fail.

I'll be going back to pg. 14 of this thread to see if there is anything more I can insert -- or assert -- that is in keeping with the thrust of this thread. Otherwise, you've saved me a few hours of posting ...which makes up for the cartridge you owe me. :wink:
 
  • #203
Gaspar

Imo it would be better if you picked up the discussion from where it is, not from where it was some time ago.

Information is relevant, but information exchange cannot explain consciousness, since information exchange can occur in the absence of consciousness.

Consciousness cannot be reduced to the physical until the 'explanatory gap' can be crossed. The only alternative is to show that there is no gap. Many have tried and nobody has yet succeeded.

One strategy is to argue that mind and brain are one thing, but even if this is judged a coherent idea (which few think it is) it does not solve anything, since it does not resolve the matter of which of mind and brain is ontologically more fundamental. At our current state of knowledge we do not know whether brain gives rise to consciousness or vice versa. Until we know this then any argument that mind and brain are identical doesn't help to answer the ontological question. It also ignores the important distinction between mind and consciousness.

In my view the hypothesis that consciousness is more fundamental than matter has far greater explanatory power and reach than the other view, and would have been adopted long ago if it wasn't for the fact that it is considered unscientific. All the known evidence is in its favour.

Regardless of whether this is true it is interesting to come at the explanatory gap from the other side for a change, and to speculate how it might be crossed starting from consciousness and ending with brain. A recent paper in the 'Journal of Consciousness Studies' makes this proposal. As this paper got past the referees we must assume that it is not an absurd idea and does not contradict the evidence.

At the moment scientific research into consciousness is based on scientific metaphysical assumptions. All the talk is of the creation of a 'science of consciousness'. This is a very, very one-sided approach and it may explain why we are not getting anywhere on this question.

All the evidence suggests that consciousness cannot be reduced to the physical. If it is fundamental, as this suggests, then it follows that it is more fundamental than matter (since matter is always reducible, i.e. made out of something other than itself). Consciousness, on the other hand, need not be made out of something other than itself. That is, at the limit an experience is what it is.

I'll pin my colours to the mast for once and say that imvho there is 'something that it is like' to be nothing (taking 'nothing' in its scientific sense), that this entails that consciousness cannot not exist, and that this explains why there is anything at all rather than nothing. It is difficult to make testable predictions from this view, but postdictions abound (e.g. the existence of the explanatory gap). It also explains why science cannot explain consciousness, for by this view science has got hold of the wrong end of the stick big time.

(Dammit, why aren't my edits being saved if I do more than one edit? Am I doing something wrong?)
 
  • #204
Originally posted by Canute
Gaspar
Imo it would be better if you picked up the discussion from where it is, not from where it was some time ago.
Today is Feb. 5. Page 14 was Feb. 2. I don't think we're talking ancient history here. And even if it WERE, let us not pretend that the discussion on these threads is linear in any way. What I was doing was addressing the POVs of participants vis a vis a reductive explanation of consciousness and my own particular POV.

Since then -- that is, since YESTERDAY: post above -- it has been politely suggested that I am "off topic" and so hesitate to respond to your comments about my "personal hypothesis". And now that I've hesitated ... I will proceed. However, if you -- or anyone reading this -- knows how to kick all of my posts and your subsequent ones over to my thread "Consciousness Continuum" ...please inform me how ...as this will be the last time I speak of my ideas here.

Information is relevant, but information exchange cannot explain consciousness, since information exchange can occur in the absence of consciousness.
My point is that ANY "information exchange" constitutes a "degree" of "consciousness" and that, in fact, there is never an "absence of consciousness" in a cause & effect Universe.


Consciousness cannot be reduced to the physical until the 'explanatory gap' can be crossed. The only alternative is to show that there is no gap. Many have tried and nobody has yet succeeded.

One strategy is to argue that mind and brain are one thing, but even if this is judged a coherent idea (which few think it is) it does not solve anything, since it does not resolve the matter of which of mind and brain is ontologically more fundamental. At our current state of knowledge we do not know whether brain gives rise to consciousness or vice versa. Until we know this then any argument that mind and brain are identical doesn't help to answer the ontological question. It also ignores the important distinction between mind and consciousness.

In my view the hypothesis that consciousness is more fundamental than matter has far greater explanatory power and reach than the other view, and would have been adopted long ago if it wasn't for the fact that it is considered unscientific. All the known evidence is in its favour.

Regardless of whether this is true it is interesting to come at the explanatory gap from the other side for a change, and to speculate how it might be crossed starting from consciousness and ending with brain. A recent paper in the 'Journal of Consciousness Studies' makes this proposal. As this paper got past the referees we must assume that it is not an absurd idea and does not contradict the evidence.

At the moment scientific research into consciousness is based on scientific metaphysical assumptions. All the talk is of the creation of a 'science of consciousness'. This is a very, very one-sided approach and it may explain why we are not getting anywhere on this question.

All the evidence suggests that consciousness cannot be reduced to the physical. If it is fundamental, as this suggests, then it follows that it is more fundamental than matter (since matter is always reducible, i.e. made out of something other than itself). Consciousness, on the other hand, need not be made out of something other than itself. That is, at the limit an experience is what it is.

I'll pin my colours to the mast for once and say that imvho there is 'something that it is like' to be nothing (taking 'nothing' in its scientific sense), that this entails that consciousness cannot not exist, and that this explains why there is anything at all rather than nothing. It is difficult to make testable predictions from this view, but postdictions abound (e.g. the existence of the explanatory gap). It also explains why science cannot explain consciousness, for by this view science has got hold of the wrong end of the stick big time.

(Dammit, why aren't my edits being saved if I do more than one edit? Am I doing something wrong?)

Maybe the Universe doesn't want you to change your mind ...noting what happened to Einstein and his "cosmological constant".

In any event, I have changed MY mind, in that I want to address all you have said above, but not here ...in deference to the desires of the thread's author. Since it appears that you are even more technically challenged than I, it is probably futile to ask you how to kick this ...and all my posts since page 13 -- over to MY thread "Consciousness Continuum" to "prime the pump" as it were ...or at least put a quarter in the cup! It's lonely at the top.
 
  • #205
Ok, Ok, let's get to this again. If the argument is that materialism as it stands cannot account for all of the problem of consciousness, agreed. If it did, then there would be nothing left to debate (and what fun would that be?)

So what we need to do is look at what we believe consciousness to be. Now, people wish to include subjectiveness, 1st person view, seeming, feeling, emotion, everything... Ok, that's great. All of this can be consciousness.

How can I talk of consciousness on a physical level and then start making claims of seeming, when, as it was pointed out, it does not seem like atoms and energy or matter have consciousness?

This is what I would like to say. The essence of consciousness lies in physics. Why? It is through these building blocks that biology is formed. Conscious experience, however, can ONLY be reached through biology as far as we know. Biology allows us to respond and interact with our world. So all this seeming and emotion is very much a biology-driven response to certain things. It may be found that other things that are not biology have the ability to have a conscious experience, but right now we do not know that to be the case.

What would be a nice case to mention here: someone mentioned to me that violet electric light has its own consciousness. Why? They claimed that they could make this electric light do what they wanted it to do: i.e., go against its own normal behaviour patterns, and go and seek out all cancerous cells in the body (for instance). Does anyone know anything about this? This might be a point case example of something that is not biology but is conscious?

About conceiving that there could be a place in which the brain does not contain consciousness, as a metaphysical question, I still don't see how that is relevant to the study of this contingent world. Sure, maybe it is the case elsewhere. Conception of something is not a good reason to consider it. If we wanted to accept everything we conceived as an argument for the demise of a previous thought, then we would be back into superstition, not science.

Originally posted by hypnagogue
If Chalmers' argument is correct, then it is impossible even in principle to fully explain consciousness solely in terms of materialistic entities and properties. Even given a perfect theoretical mapping of brain states onto conscious states, we would still not be able to dispense with the explanatory gap using only a materialist framework. (I suggest you read the two articles linked to in my last reply to Dark Wing if you are interested in a more thorough argument.)

This is just it. We are not mapping conscious states to brain states, or vice versa. There is no separation, and therefore no mapping can happen. It is a case of "oh, look, we stimulate this neuron, and check out that laughter"; it is not mapping, it's understanding. There is no explanatory gap, if you conceive that biology is our link to our world, and therefore the reason we even have a conscious experience. What consciousness IS is an entirely different question: but we can explain the experience.

On the one hand you present consciousness with a very behaviorist kind of flavor, saying it is nothing more than the movement of matter and energy, and that really there is no such thing as consciousness; on the other hand you say that consciousness is the case, on the basis of Searle's "seeming" argument.

No, I am saying that energy and matter make the building blocks on which a conscious experience may lie. Energy and matter cannot be conditioned, nor will they change their behaviour pattern (let's not go down the quantum mechanics line just yet). When I say there is no such thing as consciousness, I am saying that there is nothing emergent from any form of materialism to explain conscious experience. It just is. Do we have proof that this is the case? Not yet. I am simply putting this up as a new paradigm to study: instead of reducing one to the other, claim that one IS the other, and study the brain as the brain. See how it connects to our senses to create that wonderful feeling of subjectivity. There is no explanatory gap, as there is no gap at all. Stop thinking of it as a problem, and see it as a contingent fact: the brain works as a processing machine, and what it does is make us feel the environment through very physical biological ways. These processes are slowly being understood better as we go along. We know how to make people feel things by giving them certain chemicals. We can explain a lot of the world through chemical interaction. Even our subjective experiences are beginning to be explained by brain movement.

It's going to take a lot of time and study. But I think it's possible; if not, then at least a worthy research project.

As far as we stand in the CR argument, we are the Chinese dude in the room, the one you can't tell from the analogue system. Interesting? The problem of other minds? No doubt. But Malcolm has a great paper about that where he basically turns the problem around to show that we should not worry about others being conscious, but worry about whether we are conscious ourselves. I will find a link on the net to it; I only have it in paper here. I will do that soon, and write a more formulated answer more officially to your reply post, hypnagogue. Sorry if this is a little unorganised; I am in a bit of a rush.
 
  • #206
Originally posted by Dark Wing

No, I am saying that energy and matter make up the building blocks on which a conscious experience may lie. Energy and matter cannot be conditioned, nor will they change their behaviour pattern (let's not go down the quantum mechanics line just yet).

Sorry, Hypnagogue, but I must reply to Dark Wing here:

Of COURSE energy and matter "change their behavior patterns" based on information they receive/perceive from each other. Does not the electron shift its orbit based on something it's detecting? Do not elementary particles "stick together" due to certain forces THEY are detecting? Do not galaxies "hang together" because of "information" THEY'RE detecting (gravity, for starters)? As to QM? Don't get me started.

Again: It's a Cause & Effect Universe and "information detection and response" are the most basic components of "consciousness".

When I say there is no such thing as consciousness, I am saying that there is nothing emergent from any form of materialism to explain conscious experience. It just is.
Oddly, I agree that consciousness is NOT "emergent" but just "is" DUE TO the proclivity of all the parts of the Universe to detect and respond to each other.
Do we have proof that this is the case? Not yet.
Sure we do (have "proof"). Everything that physicists study "proves" that "matter" SENSES & RESPONDS TO "forces". They just don't call it "consciousness". But they SHOULD.

I am simply putting this up as a new paradigm to study: instead of reducing one to the other, claim that one IS the other, and study the brain as the brain. See how it connects to our senses to create that wonderful feeling of subjectivity.
I like MY "new paradigm" better. The brain is "simply" a biological device designed via evolution to DETECT & RESPOND TO INFORMATION in a most complex way. (And not all "feelings of subjectivity" are "wonderful" as, for instance, my subjective response at being told I am not "on point".)
There is no explanatory gap, as there is no gap at all. Stop thinking of it as a problem, and see it as a contingent fact: the brain works as a processing machine, and what it does is make us feel the environment through very physical, biological means. These processes are slowly being understood better as we go along. We know how to make people feel things by giving them certain chemicals. We can explain a lot of the world through chemical interaction. Even our subjective experiences are beginning to be explained by brain activity.
It's still info-exchange at its core.
 
  • #207
M. Gaspar

Sorry, I wasn't suggesting your posts weren't relevant; I just didn't want to go back over previous points in the discussion. Actually, I like your information theory in many ways. I just don't see how it explains consciousness as opposed to the contents of consciousness. The question is how information becomes meaning.

Originally posted by Dark Wing
OK, OK, let's get to this again. If the argument is that materialism as it stands cannot account for all of the problem of consciousness, agreed.
This is not quite right. It is true that materialism cannot account for consciousness, but it accounts very well indeed for the problem of consciousness.

This is what I would like to say. The essence of consciousness lies in physics. Why? It is through these building blocks that biology is formed. Conscious experience, however, can ONLY be reached through biology as far as we know.
But consciousness does not arise from biology as far as we know, so why do you assume it? Once you've assumed it the discussion is over.

If we wanted to accept everything we conceived as an argument for the demise of a previous thought, then we would be back in superstition, not science.
It is not superstitious not to make assumptions.

This is just it. We are not mapping conscious states to brain states, or vice versa. There is no separation, and therefore no mapping can happen. It is a case of "oh, look, we stimulate this neuron, and check out that laughter." It is not mapping, it is understanding.
If I understand you right, you are arguing either for behaviourism or for the view that consciousness is the brain. Both of these views have been shown to be incoherent. If they aren't, then we don't need to study the brain to understand the mind, for it will be much easier the other way around.

No, I am saying that energy and matter make up the building blocks on which a conscious experience may lie.
That'll stay as just your opinion until someone finds some evidence of its truth.

I am simply putting this up as a new paradigm to study: instead of reducing one to the other, claim that one IS the other, and study the brain as the brain. See how it connects to our senses to create that wonderful feeling of subjectivity.
But if mind and brain are the same thing then why does it matter which we study? We can let psychologists study the brain instead of neuroscientists.

We know how to make people feel things by giving them certain chemicals. We can explain a lot of the world through chemical interaction. Even our subjective experiences are beginning to be explained by brain activity.
True. But it doesn't help. Nobody argues that brain states are not causally linked to conscious states. The question is how far this causality extends.
 
  • #208
M. Gaspar, please don't take my comments to mean I don't want you posting in this thread. I just want the discussion to stay on track. On further reflection, I think my main objection was that you posted a flurry of posts which could have been condensed into one post. But anyway, now that you're caught up and into the discussion that shouldn't be a problem anymore. So I apologize for coming off the wrong way and ask you to please not hesitate to make your own contributions to this thread.

That having been said, I would like to comment on a view that is in some degree held by both M. Gaspar and Dark Wing. Both seem to advocate "response to information" as tightly bound up with the concept of consciousness. Strictly speaking, although consciousness certainly does involve response to information, it is not a good idea to equate or tightly correlate the two. There are mounds of research suggesting that a great deal of the information processing that the brain does occurs entirely independently of consciousness. For instance, patients with blindsight can meaningfully interact with objects even though they have no visual awareness of these objects or conceptual awareness of exactly how it is that they can react meaningfully to things they can't see. This suggests that "response to information" is not a sufficient condition for consciousness; response to the environment can occur without attendant conscious experience.
 
  • #209
Originally posted by Dark Wing
This is what I would like to say. The essence of consciousness lies in physics. Why? It is through these building blocks that biology is formed. Conscious experience, however, can ONLY be reached through biology as far as we know. Biology allows us to respond to and interact with our world.

But response and interaction with our world are not sufficient conditions for consciousness. See my previous post.

A nice case to mention here: someone mentioned to me that violet electric light has its own consciousness. Why? They claimed that they could make this electric light do what they wanted it to do, i.e. go against its own normal behaviour patterns and seek out all cancerous cells in the body (for instance). Does anyone know anything about this? This might be a case in point of something that is not biology but is conscious.

I find it curious that you entertain this as an example of some nonbiological system that might be conscious, when we could just as well "make something do what we want it to do," i.e. go against its own normal behavior patterns and seek out some particular type of object in the environment, by building a suitable robot. But it seems to me that you refuse to give a robot as much consideration as a candidate for consciousness as you would give to a bundle of photons.

As for conceiving, as a metaphysical question, that there could be a place in which the brain does not contain consciousness, I still don't see how that is relevant to the study of this contingent world. Sure, maybe it is the case elsewhere. Conceiving of something is not a good reason to consider it. If we wanted to accept everything we conceived as an argument for the demise of a previous thought, then we would be back in superstition, not science.

Again, the conceivability argument is just another way of reflecting how consciousness is epistemologically and ontologically distinct from 'ordinary' physical phenomena. Simply put, we cannot rationally imagine a world physically identical to ours where H2O molecules do not combine to form water, but we can rationally imagine a world physically identical to ours where the neurons of a human brain do not combine to form consciousness.

The reason this is relevant is that it illustrates a fundamental difference in the way we understand and can explain consciousness vis a vis classical physical objects, and this in turn has ontological consequences-- it tells us something about how the world must actually be. Once we accept the axioms of materialism, we can show that H2O molecules form water by logical necessity, but we cannot show an analogous logically necessary link between the physical world as we understand it and consciousness, even in principle. This suggests that the model of the world put forth by materialism is insufficient to account for consciousness. If materialism/physicalism/mechanism were sufficient to explain consciousness, then we should be able to produce an argument showing how consciousness follows from their assumptions by logical necessity. If we cannot theoretically derive consciousness from these theoretical models of reality even in principle, this suggests that if the world really were as these models of reality state it is, then consciousness would not exist. But, of course, consciousness does exist. So these models must be fundamentally inadequate depictions of the world, as they have nothing meaningful to say about consciousness.

For a detailed discussion of the explanatory gap, please see the paper http://www.uni-bielefeld.de/(en)/philosophie/personen/beckermann/broad_ew.pdf, by Ansgar Beckermann. It is a bit of a lengthy read (14 pages), but perhaps after reading it you will come to a fuller appreciation for why the explanatory gap cannot be so easily shaken off. (This paper includes a refutation of the notion that simply equating qualitative properties with physical processes makes for a successful reductive/physical explanation.)

This is just it. We are not mapping conscious states to brain states, or vice versa. There is no separation, and therefore no mapping can happen. It is a case of "oh, look, we stimulate this neuron, and check out that laughter." It is not mapping, it is understanding. There is no explanatory gap if you conceive that biology is our link to our world, and therefore the reason we even have a conscious experience. What consciousness IS is an entirely different question, but we can explain the experience.

From the 3rd person view, there is no problem: we excite some neurons, we observe laughter. There is a clear causal connection. But that is not the heart of the matter. The heart of the matter is traversing the gap from the 3rd person view to the 1st person view. In your example, we can observe the person's laughter, but we cannot observe his qualitative sense of comedy. We can explain his laughter as observed from the 3rd person view via a functional explanation: the activation of certain neurons leads to the activation of other neurons, and eventually motor neurons are activated which fully account for the characteristic motor behaviors of spastic breathing and smiling facial expression. But this functional 3rd person explanation cannot explain why the person subjectively experienced humor from his 1st person view.

The 3rd person view involves the straightforward causal connection from one structural/functional system (the brain) to other structural/functional systems (respiratory system, facial musculature, etc.). The 3rd-to-1st person view involves a causal connection from a structural/functional system (the brain) to an intrinsic, qualitative system (consciousness). It is obvious how one structural/functional system can causally connect to another structural/functional system, but not obvious at all how a structural/functional system can causally connect to qualitative experience. Under a materialistic framework, there is no straightforward theoretical explanation, only correlation, between the two. So it is appropriate at this point to speak of 3rd-to-1st person phenomena as a "mapping" instead of merely identifying the two. The explanatory gap persists (again, please see Beckermann's paper).

Energy and matter cannot be conditioned, nor will they change their behaviour pattern (let's not go down the quantum mechanics line just yet).

Sure energy and matter can be conditioned. In fact, in principle we can explain a person's behavioral conditioning (response and interaction with his environment) entirely in terms of matter and energy-- that is, in terms of the plasticity of his neurons, and their physical adaptation and rewiring as a function of environmental inputs. Neurons that adapt as such change their net computational processing, which in turn changes one's behavioral patterns. That is a clear-cut and conceptually complete example of matter being conditioned and changing in response to its environment, and still we have no indication whatsoever of consciousness in our explanatory model.
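To make that concrete, here is a minimal toy sketch of my own (not a model anyone in this thread has actually proposed; the function name and learning rate are placeholders) in which "conditioning" is described purely as a change in a numerical synaptic weight. Nothing in the description refers to experience at all.

```python
# A purely illustrative sketch: classical conditioning as nothing but a
# weight update (a Rescorla-Wagner-style rule). "Plasticity" here is just
# a number changing; no term in the model refers to experience.

def condition(trials, learning_rate=0.1):
    weight = 0.0  # associative strength of the conditioned stimulus (e.g. a bell)
    for bell, food in trials:  # each trial: was the bell rung? was food given?
        if bell:
            prediction_error = (1.0 if food else 0.0) - weight
            weight += learning_rate * prediction_error  # the "rewiring"
    return weight

# Twenty bell-food pairings: the weight climbs toward 1.0, i.e. the
# system's response to the bell has been "conditioned".
print(condition([(True, True)] * 20))
```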

As far as we stand in the CR argument, we are the Chinese dude in the room, the one you can't tell from the analogue system. Interesting? The problem of other minds? No doubt. But Malcolm has a great paper about that where he basically turns the problem around to show that we should not worry about whether others are conscious, but about whether we are conscious ourselves. I will find a link to it on the net (I only have it on paper here) and write a more formal answer to your reply post soon, hypnagogue. Sorry if this is a little disorganised; I am in a bit of a rush.

I look forward to reading the paper. Still, I think my critique of the CR argument stands. The philosophical thrust of the CR argument applies to human brains just as much as it does to computers, or systems of pipes, or any other physical system.

Say the Chinese Room is the brain of a Chinese person, and the person inside the CR (or CB-- Chinese Brain) is a microscopic demon who is conscious in the same way humans are, but only understands and speaks English. (A bit of a stretch as compared to the traditional CR formulation, I know, but it still serves to illustrate my point.) If the English-speaking demon inside the CB does all the CB's computations for it, the demon will have the CB interpreting Chinese symbols (as encoded in the CB's auditory neurons from external stimuli) and behaviorally responding to them (speaking proper Chinese in a meaningful way with respect to the auditory stimuli), but we have no reason to think that the demon itself will understand Chinese as a result. Conceptually, there will still be syntax (physical processes) but no semantics (awareness of the significance of the syntax) for the conscious agent inside the CR.
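To show the syntax-without-semantics point in the crudest possible form, here is a toy sketch of my own (the rule table and replies are invented for illustration and are not part of Searle's argument): a program that returns appropriate Chinese replies by pure symbol matching, with nothing anywhere in it that encodes what the symbols mean.

```python
# A toy "rule book": input symbols are matched and replies emitted without
# any representation of their meaning anywhere in the program.

RULE_BOOK = {
    "你好吗？": "我很好，谢谢。",        # "How are you?" -> "I'm fine, thanks."
    "今天天气好吗？": "今天天气很好。",   # "Is the weather nice today?" -> "Yes, very nice."
}

def chinese_room(symbols: str) -> str:
    # Pure syntax: look up the input string and return the paired output string.
    return RULE_BOOK.get(symbols, "对不起，我不明白。")  # "Sorry, I don't understand."

print(chinese_room("你好吗？"))
```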

Of course, we may suppose that although the conscious agent inside the CR/CB will not be aware of the semantics of the Chinese symbols, the CB itself will be aware of the Chinese semantics-- it is a brain, after all, so we should expect that it will be conscious of the information it is processing as much as we expect any other brain to be conscious of the information it processes. But to accept this position is to accept a critical flaw in the CR argument. If the CB can be aware of Chinese semantics while the English-speaking demon inside it is not, then it could equally well be the case that the analogous phenomenon holds for the traditional CR argument. That is, it could be that although the English-speaking person inside the Chinese Room does not understand the symbols he is manipulating, the CR taken as a system will understand the semantics.
 
  • #210
Originally posted by Canute
M. Gaspar

Sorry, I wasn't suggesting your posts weren't relevant; I just didn't want to go back over previous points in the discussion. Actually, I like your information theory in many ways. I just don't see how it explains consciousness as opposed to the contents of consciousness. The question is how information becomes meaning.

You have asked excellent questions and I thank you for them.

Let me see if I can run them through my paradigm:

What would be the "contents of consciousness"? Possibly all that an entity "remembers". Our brains, for instance, are "set up" to "store" a LOT of information (units -- or WHOLE CHUNCKS OF -- "experiences" that can be accessed, referenced and assigned "meaning" ...somehow. Still, I cannot think -- as yet -- how an entity would form "meaning". In fact, I'm not sure how to DEFINE "meaning" with regard to "consciousness. What would YOU say?

This is not quite right. It is true that materialism cannot account for consciousness, but it accounts very well indeed for the problem of consciousness.
But I have said that materialism DOES account for consciousness: information is exchanged to such a degree and complexity that "meaning" arises. Thank you for that word. It is giving me something new to think about.

But consciousness does not arise from biology as far as we know, so why do you assume it? Once you've assumed it the discussion is over.
Hold on, Canute, you're preaching to the CHOIR! I am among those who do NOT believe that consciousness is contingent on biology. Biology may have achieved an "advanced state" of consciousness ...but it is NOT the only thing that's conscious. Each cell of an organism is conscious of SOMETHING ...usually MANY things. And the parts of the atoms are "aware" of each other, as might the galaxies be.


It is not superstitious not to make assumptions.
Say again.


If I understand you right, you are arguing either for behaviourism or for the view that consciousness is the brain.
Then you definitely DON'T "understand me right". The brain is an organ that enhances an entity's ability to detect, store and USE information. Consciousness is NOT the brain but the RESULT of the brain's functionality. And, while I know what "behaviorism" is, I'm not sure what you are suggesting by using the word in a discussion of consciousness. Remember, "behavior" is basically a RESPONSE that can be either "reflexive" (hence, "primitive") or "creative" ("advanced"). Thus the reptilian brain has been "overgrown" by higher cognitive capacities that give us "free will" ...INTENTIONALITY.

Both of these views have been shown to be incoherent. If they aren't, then we don't need to study the brain to understand the mind, for it will be much easier the other way around.
Please say again.

That'll stay as just your opinion until someone finds some evidence of its truth.
I'm not here to "prove" anything ...just play around with idea. If I turn out to be "right" about something(s) down the road (when the physicist finally "get there" ), goody for me ...and at least I'll have this record of my "earlier arrival".

And it would be EVEN BETTER if something I say inspires someone with a much better "left brain" than mine to "prove something" through mathematics or logic (the latter of course being hopeless). My theories thus far -- for the record -- are: (1) consciousness is the product of information exchange; (2) intention -- a product of consciousness -- impinges on the field twixt "virtual" and "manifested"; and (3) an eventual "phase transition" will turn MORE energy into matter -- the phase transition being triggered by loss of heat or density or somethin', which would be the result of the continued expansion of the Universe -- thereby "generating" sufficient gravity to STOP and REVERSE the expansion.

Number 3 stands the best chance of being mathematically proven ...if it's true.

But if mind and brain are the same thing then why does it matter which we study? We can let psychologists study the brain instead of neuroscientists.
We need to study both to maximize each.
 