How is it possible to know that we feel emotion consciously?

  • Thread starter: Sikz
  • Tags: Emotion
Summary:
The discussion centers on the nature of consciousness, emphasizing that while emotions are felt uniquely, they originate in the brain, raising questions about the relationship between consciousness and the brain. It argues against the idea of consciousness being purely physical or existing solely within the brain, suggesting instead that it may be aphysical or exist in another realm. The possibility of consciousness communicating with the brain is debated, with options including a physical connection or a non-physical "portal," which is deemed logically impossible. The conversation also touches on the implications of materialistic explanations of emotions and consciousness. Ultimately, the participants seek to explore the complexities of consciousness beyond conventional definitions and frameworks.
  • #31
Originally posted by Mentat
My dear friend, we've covered some of this before (remember this thread?), and some of it is as yet mystery (meaning, there is no theory to directly address it).

Believe me, I'm well aware that we've gone over this before. :wink: I just still have fundamental objections to some of the things you say on the subject.

Anyway, only physical things that can multi-task in the question/answer + production of multiple drafts (very interrelated concepts, btw) fashion are "conscious". Nothing else has the right qualifications, AFAIC, and the only example I can give you of something that actually does that is the brain.

This is a nice hypothesis. But it's far from well-established that "a physical system is conscious if and only if it follows 'multiple drafts' processing." Has it been shown empirically that all things that follow multiple drafts processing are conscious? Does there exist a system that does not precisely follow multiple drafts processing but is nonetheless conscious in some sense?

We can begin to answer these questions with reference to the special case of human brains and human consciousness. But we cannot so flippantly generalize observations in the special case of humans to all physical systems. We can say something like "it is our educated guess that this system is conscious, based on certain known principles of consciousness in the context of a human brain/body." But as human brains do not encompass an exhaustive representation by any means of the types of material configurations and processes that may take place in physical systems in general, we will only have principles grounded in any reasonable degree of certainty for human brains, and perhaps certain animal brains. We will not have a complete theory that we can apply in a general manner to any given physical system, since all of our understanding will be derived by reference to the particular special case of human consciousness.

That's the all-important/constantly brought up/somehow never understood point: Subjective experience is nothing but the question/answer processes of the brain. There is no further investigation necessary, once one has observed that these processes occur in the CPU of the subject, to prove that they are "conscious".

I really don't want to sound insulting in any way, so please don't take this wrong, but do you get what I'm saying now?

I'm sorry, but I just cannot accept this, given the state of our current understanding. This stance assumes that we already have a complete, well-tested and empirically verified theory of consciousness, which obviously we do not.

Suppose we had a complete, empirically verified theory of consciousness along the lines of the following: for all physical systems A, if A has physical properties X, then A is conscious; otherwise, A is not conscious. Then we could indeed examine any physical system to see if it held properties X and thus deduce whether or not it is conscious.

But to have such a complete and valid theory, we would need to have an empirically verified mapping of objective physical states to subjective conscious states. But we can only empirically verify such mappings for the human brain, and can only apply the principles of consciousness thus derived to physical systems similar to the human brain. Thus, unless we find some objective way of directly measuring the presence of consciousness in any physical system, our theory will necessarily be incomplete and fraught with uncertainties and questionable guesswork.

As per the previously stated (in red) postulate (intentional stance, btw, but you already knew that), behavioral analysis is all that is necessary (provided, of course, that "behavior" encompasses activity of the brain).

Nonsense. This is true to some extent for humans, but only because we already know as human beings that we ourselves are conscious. We directly experience our own consciousnesses, thus we start from a position of empirical verification of consciousness in the case of the human brain. From this starting point, we can observe correlations between behaviors of our own with certain directly perceived conscious experiences of our own (including, most importantly, linguistic behaviors/verbal reports). We can then roughly deduce that a similar behavior exhibited by another person indicates a similar conscious experience on the part of that person. But this entire approach depends on the fact that we start from a position of direct perception (empirical verification) of consciousness.

This approach falls apart if we try to apply it to physical systems in general, and not just humans and life forms similar to humans in particular.

Consciousness is not indicated by any behaviors, it is a behavior. There's a huge difference in these two postulates.

But there is no way to know in general whether any given behavior of any given physical system is conscious or not, except by being a physical system of that type in the first place. Thus, even if we accept a purely materialistic framework of consciousness, we must still speak in general of behaviors indicating consciousness, since we have doubt as to whether such and such material process is truly sufficient for consciousness in any given context. In this case, "indicates" is a concession of our epistemic uncertainty, not an assertion of a dualist ontology.
 
  • #32
Originally posted by Mentat
Fliption, I believe Zero was referring to Sikz having separated "mind" from "brain" off-hand, when he (Zero) referred to the "flawed premise".


And this is an old philosophical issue. A philosophical opinion from both parties is expected.

Anyway, I don't just "claim" that this is wrong; I have presented logical barriers to its being right (on other threads which you participated in), and no one has been able to combat them yet. That doesn't mean that I'm "right" either, but "right" doesn't exist in logic or philosophy and thus is not a concern of mine. If my argument holds as "valid" against all other arguments, that's enough for me.

I wasn't referring to you with my comments above. And I must politely disagree that no one has been able to combat your points. Whether you view their responses as sufficient to counter them is a different matter. I'm not convinced any of your points are as strong as you view them, and I feel that they have been adequately countered.

I will always concede it is possible that I may not completely understand someone's point, especially over a medium like a forum. For example, you have said to me and to hypnagogue things like "X is not an effect of Y. X is Y". This statement to me says nothing about reality. It's just a labeling word game. This statement has no explanatory power at all and communicates nothing different to me about reality. I can't visualize any difference. Again, it's just word games. But the wiser side of me says there must be something here I'm not seeing if you keep repeating it. Perhaps this is one of those things where an analogy to get a point across is better than just repeating the same thing?

I'll promise you this. If your points are as strong as you say they are, then you'll really want to show it to me, because I will become your biggest defender. That is a promise. But at the moment, I honestly just don't believe your views solve any philosophical issues on this topic. Keep in mind that, IMO, Dennett never solved any problem; he only defines it away. So my perspective is very different from yours, and some work may be required from you to communicate the point.
 
  • #33
Originally posted by hypnagogue
This is a nice hypothesis. But it's far from well-established that "a physical system is conscious if and only if it follows 'multiple drafts' processing." Has it been shown empirically that all things that follow multiple drafts processing are conscious? Does there exist a system that does not precisely follow multiple drafts processing but is nonetheless conscious in some sense?

Consciousness altogether - i.e. "awareness" - is possible in some of the more rudimentary of life-forms. It's consciousness of consciousness (also referred to as self-consciousness) and the ability for analysis that requires a CPU something like ours.

Anyway, you need to remember that, in the intentional stance, there is no distinction between the processing and the consciousness. So, such questions as "can something follow MD processing and not be conscious" or "can something be conscious in spite of not following MD" are really non sequiturs - and would be much like asking "can something be conscious without being conscious" :wink:.

We can begin to answer these questions with reference to the special case of human brains and human consciousness. But we cannot so flippantly generalize observations in the special case of humans to all physical systems. We can say something like "it is our educated guess that this system is conscious, based on certain known principles of consciousness in the context of a human brain/body." But as human brains do not encompass an exhaustive representation by any means of the types of material configurations and processes that may take place in physical systems in general, we will only have principles grounded in any reasonable degree of certainty for human brains, and perhaps certain animal brains. We will not have a complete theory that we can apply in a general manner to any given physical system, since all of our understanding will be derived by reference to the particular special case of human consciousness.

Yeah, that's mostly true. However, I think it's a big step toward understanding all of consciousness, if one understands animal consciousness. After all, nothing else on Earth is likely to be conscious at any discernible level anyway ("conscious", in this context, is under Nagel's concept: if it is "like something" to be that thing, then it's conscious).

I'm sorry, but I just cannot accept this, given the state of our current understanding. This stance assumes that we already have a complete, well-tested and empirically verified theory of consciousness, which obviously we do not.

Suppose we had a complete, empirically verified theory of consciousness along the lines of the following: for all physical systems A, if A has physical properties X, then A is conscious; otherwise, A is not conscious. Then we could indeed examine any physical system to see if it held properties X and thus deduce whether or not it is conscious.

But to have such a complete and valid theory, we would need to have an empirically verified mapping of objective physical states to subjective conscious states.

No, we wouldn't. I think there is a concept that is rather ingrained in your mind (and in most human minds) that doesn't allow for the heterophenomenological approach, but there is really nothing illogical about it. IOW, there is nothing wrong with assuming, not that "if A has physical properties X, then A is conscious", but rather "if A has physical properties X, then A has consciousness, because consciousness = X".

But we can only empirically verify such mappings for the human brain, and can only apply the principles of consciousness thus derived to physical systems similar to the human brain. Thus, unless we find some objective way of directly measuring the presence of consciousness in any physical system, our theory will necessarily be incomplete and fraught with uncertainties and questionable guesswork.

But the consciousness itself is the physical process. It doesn't "indicate" the presence of consciousness, it is the presence of consciousness.

Nonsense. This is true to some extent for humans, but only because we already know as human beings that we ourselves are conscious. We directly experience our own consciousnesses, thus we start from a position of empirical verification of consciousness in the case of the human brain. From this starting point, we can observe correlations between behaviors of our own with certain directly perceived conscious experiences of our own (including, most importantly, linguistic behaviors/verbal reports). We can then roughly deduce that a similar behavior exhibited by another person indicates a similar conscious experience on the part of that person. But this entire approach depends on the fact that we start from a position of direct perception (empirical verification) of consciousness.

But that's the whole point, we are starting from the intentional stance. Besides, and I want to be clear on this: We are not seeking empirical verification of subjective consciousness - this would imply an "apparatus" of consciousness - we are searching merely for the "apparatus", since that is consciousness.

But there is no way to know in general whether any given behavior of any given physical system is conscious or not, except by being a physical system of that type in the first place.

Not true...at least, not according to the heterophenomenological approach that Dennett takes (I can't state absolute truths), since we can determine the consciousness of any physical system using knowledge of MD and question/answer processes (if Dennett is right, that is).

Thus, even if we accept a purely materialistic framework of consciousness, we must still speak in general of behaviors indicating consciousness

No, no, no! The behaviors are consciousness.

since we have doubt as to whether such and such material process is truly sufficient for consciousness in any given context. In this case, "indicates" is a concession of our epistemic uncertainty, not an assertion of a dualist ontology.

But the dualism will always be implied when you separate "subjective experience" from the nitty-gritty of neurological science (the electrochemical processes themselves).
 
  • #34
Originally posted by Fliption
And this is an old philosophical issue. A philosophical opinion from both parties is expected.

You and I both know that there is a standing logical (not scientific, not opinionated, not philosophical, but logical) problem with the mind being anything but the brain.

I wasn't referring to you with my comments above. And I must politely disagree that people have been able to combat your points. Whether you view them as sufficient to counter them is a different matter. I'm not convinced any of your points are as strong as you view them and feel that they have been adequately countered.

That's because, for some reason, I can't seem to make the homunculun problem intelligible to you. I even simplified (or, it was simpler in my opinion) the concept to asking "what good is a monitor inside your PC?", but that didn't seem to work. What is it exactly that you don't understand about the homunculun problem (or is it just the whole concept that seems obsolete to you)?

As to my number 1 point, there have been significant responses (mostly dealing with the question of "what is physical"), but they are overcome by a scientific definition of "physical", provided we don't play the reduction game.

I will always concede it is possible that I may not completely understand someones point. Especially over a medium like a forum. For example, you have said to me and to Hynagogue things like "X is not an effect of Y. X is Y". This statement to me says nothing about reality. It's just a labeling word game. This statement has no explanatory power at all and communicates nothing different to me about reality. I can't visualize any difference.

Look, the difference is this. If I say that brain A is performing process X (the question/answer + Multiple Drafts processes of Dennett's theory, for example), and is thus determinably conscious, I am not saying "process X indicates consciousness on the part of brain A", but rather "process X is all that there is to consciousness, and thus the statement 'brain A is performing process X' is precisely equal to (I can't seem to make the math symbol work here) the statement 'brain A is conscious'".

I'll promise you this. If your points are as strong as you say they are, then you'll really want to show it to me, because I will become your biggest defender. That is a promise. But at the moment, I honestly just don't believe your views solve any philosophical issues on this topic. Keep in mind that, IMO, Dennett never solved any problem; he only defines it away. So my perspective is very different from yours, and some work may be required from you to communicate the point.

I appreciate your open-mindedness. I think the real barrier to your understanding my position is its counter-intuitiveness. Intuitively, we "know" that consciousness is something special, produced by processes of the brain, and so it is counter-intuitive to say that that is wrong and that consciousness is an electrochemical process of the brain. IOW, we want to believe that phenomenological events "occur", even if not "really" occurring, but this cannot be so due to the homunculun problem and the problem of physical/non-physical cooperation (if the phenomenological events were physical then they would, by definition, take up space, and this cannot be so, as we would have noticed by now (what, with all the thousands of thoughts I must be thinking right now)).
 
  • #35
Fliption, I think I found my analogy! (Please excuse any slight incoherence on my part, as I'm acting off inspiration, and I don't want to lose sight of the underlying principle.)

Ok, there is a thread in the Biology Forum about animal testing, and it seems that there are many people who feel sympathy for the animals. Well, I don't feel such sympathy, and a couple of my acquaintances share my opinion, except they have added the reason that they don't believe animals feel pain! Oh, they believe that the animal has a constant bombardment of neuronal activity when put under "painful" circumstances, but they just don't believe that that neuronal activity in the animal translates to actual "pain".

Since my position is currently a completely Materialistic one, I am obliged to ask them what they feel the difference is between excited neuronal activity and "actual pain". They confess that they don't know, but that that's just an example of our (humanity's) limited knowledge. However, if one applies the intentional stance to the circumstance, it all becomes clear.

The intentional stance dictates: If pain can be said to be equal to process Y of certain nerves, then all bodies whose nerves are undergoing process Y are in pain (no scare-quotes this time) - and, in fact, the two statements are equivalent ("certain nerves are undergoing process Y" and "the animal is in pain").

So, you see, this is how we differentiate between the propositions "process X produces C" (where "C" can refer to consciousness or pain or whatever other process we are discussing) and "process X is C" (such as when we establish that a certain excited nerve activity is pain).

I know that might be a little sloppy, and I'm going to re-read it once I post it, and edit as necessary...but I hope I got the point across.
 
  • #36
I think the intentional stance is an excellent principle for understanding the world, but it seems to be assuming what it shows in the case of animal pain. We say that nerves doing X = pain in ourselves because we know we work that way. But we can't say it of animals unless we can show that they work that way too. No?

I think animals clearly express pain (flinching, whining, etc.) and remember pain (a cat will avoid a stove after burning itself). So it is really stretching a point for your friends to say they don't experience pain.
 
  • #37
Originally posted by selfAdjoint
I think the intentional stance is an excellent principle for understanding the world, but it seems to be assuming what it shows in the case of animal pain. We say that nerves doing X = pain in ourselves because we know we work that way. But we can't say it of animals unless we can show that they work that way too. No?

Well, actually, the intentional stance (having postulated that X = pain) shows that any animal whose nerves undergo X is feeling pain (because the statements "my nerves are doing X" and "I'm feeling pain" are now identical).

I think animals clearly express pain (flinching, whining, etc.) and remember pain (a cat will avoid a stove after burning itself). So it is really stretching a point for your friends to say they don't experience pain.

Yeah, I mentioned that to them too (since they barely understood the intentional stance, no matter how hard I tried to explain it (I really need the PFs for intellectual stimulation, as most of my "friends" are lacking in this particular ability)).
 
  • #38
You and I both know that there is a standing logical (not scientific, not opinionated, not philosophical, but logical) problem with the mind being anything but the brain.

There are arguments against this. hypnagogue has done an excellent job (I understand it) of presenting them in another thread. Regardless, my initial comment here was to respond to the habitual substance-lacking, foot-stomping posts of certain individuals.


That's because, for some reason, I can't seem to make the homunculun problem intelligible to you. I even simplified (or, it was simpler in my opinion) the concept to asking "what good is a monitor inside your PC?", but that didn't seem to work. What is it exactly that you don't understand about the homunculun problem (or is it just the whole concept that seems obsolete to you)?

What I don't understand is exactly what you think it refutes. I don't even understand what point you think it's relevant to. So what if there is no internal viewer? What does this explain?

I have searched on the Internet to see what info is written on it, and the only thing that comes close is something called the "homunculus" problem. I've read some of it and am still reading other material, but so far it doesn't seem to be used in philosophical discussions. The discussions are more scientific. I'm still looking at this, though.




Look, the difference is this. If I say that brain A is performing process X (the question/answer + Multiple Drafts processes of Dennett's theory, for example), and is thus determinably conscious, I am not saying "process X indicates consciousness on the part of brain A", but rather "process X is all that there is to consciousness, and thus the statement 'brain A is performing process X' is precisely equal to (I can't seem to make the math symbol work here) the statement 'brain A is conscious'".


Yes, you've said this many times. I understand it and am not in any position to say it's wrong. But what difference it makes in the discussions we've had is what I'm struggling with. If I say riding a bicycle produces fun and you correct me by saying "No, bicycling does not produce fun, bicycling is fun", how does this explain the nature/existence of fun? It seems as if you think making this statement makes the "mystery" of fun go away.


I appreciate your open-mindedness. I think the real barrier to your understanding my position is it's counter-intuitiveness. Intuitively, we "know" that consciousness is something special, produced by processes of the brain, and so it is counter-intuitive to say that that is wrong and that consciousness is an electrochemical process of the brain. IOW, we want to believe that phenomenological events "occur", even if not "really" occurring, but this cannot be so due to the homunculun problem and the problem of physical/non-physical cooperation (if the phenomenological events were physical then they would, by definition, take up space, and this cannot be so as we would have noticed by now (what, with all the thousands of thoughts I must be thinking right now)).

But that's just it. I do not hold the belief that consciousness is "special", except for the fact that it has not been completely explained. I have felt from the very beginning that you were approaching your discussions with me with the opinion that I had an opposing view. If you'll notice, I typically insert myself into discussions where people quickly cast off all views but their own. It's because when I look at these things I truly have no idea (I get frustrated that the universe seems so complex). My view is that if someone is so certain, they should be able to convince me. If you know this about me, you can see why my patience is thin with certain people who know everything there is to know about reality and don't think anyone who disagrees with them is worth the time for an explanation (I'm not referring to you).

My biggest disagreement with you has been your view of what idealism/materialism is, not really which one of them is correct. As I've said before, we haven't even started that discussion. But I will continue to point out in these discussions of materialism vs. idealism when I think the disagreement/debating points are due to sloppy language and not a real philosophical issue, because I still don't think the definitions are appropriate and they are being used interchangeably (unintentionally, I'm sure) when it suits the view.
 
  • #39
Originally posted by Fliption
Except that this is a philosophy forum. And the point here is to actually explain and discuss why this is so. Not just claim it so.

Besides, I thought Sikz was more presenting the options than starting with a premise.
Buzz off! Shall we spend 50 pages discussing Santa Claus next?

*edit*

The real problem with all of this is that it is self-centered feel-good nonsense, but still nonsense. We've gone over this before, and while Mentat is more eloquent in how he describes it, we are both on the same page (or at least reading from the same book). Emotions are refined pain/pleasure responses with survival benefits. It isn't magic, it doesn't come from the incoherent notion of a 'soul'; it is part and parcel of brain function.
 
  • #40
Originally posted by Mentat
Fliption, I think I found my analogy! (Please excuse any slight incoherence on my part, as I'm acting off inspiration, and I don't want to lose sight of the underlying principle.)


Well, this analogy really is just an example of what you're saying. But how does knowing this in this analogy help me make any conclusions about anything? How does this decide materialism or idealism? I guess maybe part of our confusion is that whether X is the effect of a process or IS that process, it doesn't mean it isn't completely material to me, but to you it obviously does. This is purely a semantic matter of how you define "effect". So I've never understood how this was relevant in a discussion of materialism. That's why in the other thread I was trying to get some agreement that physical processes are linked to/caused by other physical processes.


Let me probe a little bit here with your analogy to make sure I understand your view. The first thing I thought was: if what you say is true, then how do you explain the difference between a patient that has been put to sleep during an operation and one that is awake? I was thinking that the nerve responses in the area of the operation (the hand, let's say) would be exactly the same in both patients, yet one of them feels pain and the other doesn't. Then I realized that you aren't confining process X to only the nerve process in the hand. You're including all the activity that the sleeping drug would have turned off in the brain as well. Is this correct?

While I think I understand what you're saying, I still need you to connect this analogy to how I can achieve some knowledge. How does what appears to be semantic gymnastics provide additional information? I'm still stuck on the thought "whether Y is an effect of process X or IS process X, Y still exists nonetheless and requires explanation." I can see that if you claim Y to be equal to process X, then a study of process X is all that is needed. But I would think that process X would have to account for specific characteristics of Y. I don't think we can just avoid associating characteristics of Y with process X just because we assume they are the same. Right? If so, then how does making this assumption save us any work in explaining anything if we still have to account for "consciousness"?
 
  • #41
Originally posted by Mentat
Fliption, I think I found my analogy! (Please excuse any slight incoherence on my part, as I'm acting off inspiration, and I don't want to lose sight of the underlying principle.)

Ok, there is a thread in the Biology Forum about animal testing, and it seems that there are many people who feel sympathy for the animals. Well, I don't feel such sympathy, and a couple of my acquaintances share my opinion, except they have added the reason that they don't believe animals feel pain! Oh, they believe that the animal has a constant bombardment of neuronal activity when put under "painful" circumstances, but they just don't believe that that neuronal activity in the animal translates to actual "pain".

Since my position is currently a completely Materialistic one, I am obliged to ask them what they feel the difference is between excited neuronal activity and "actual pain". They confess that they don't know, but that that's just an example of our (humanity's) limited knowledge. However, if one applies the intentional stance to the circumstance, it all becomes clear.

The intentional stance dictates: If pain can be said to be equal to process Y of certain nerves, then all bodies whose nerves are undergoing process Y are in pain (no scare-quotes this time) - and, in fact, the two statements are equivalent ("certain nerves are undergoing process Y" and "the animal is in pain").

So, you see, this is how we differentiate between the propositions "process X produces C" (where "C" can refer to consciousness or pain or whatever other process we are discussing) and "process X is C" (such as when we establish that a certain excited nerve activity is pain).

I know that might be a little sloppy, and I'm going to re-read it once I post it, and edit as necessary...but I hope I got the point across.
I think this certainly points out a double standard among some folks, in that they feel that humans have "souls" and animals don't, and therefore there is a qualitative difference in reactions to identical stimuli.
 
  • #42
Originally posted by Zero
Buzz off! Shall we spend 50 pages discussing Santa Claus next?

*edit*

The real problem with all of this is that it is self-centered feel-good nonsense, but still nonsense. We've gone over this before, and while Mentat is more eloquent in how he describes it, we are both on the same page (or at least reading from the same book). Emotions are refined pain/pleasure responses with survival benefits. It isn't magic, it doesn't come from the incoherent notion of a 'soul'; it is part and parcel of brain function.

Who are you debating with? The skeleton in your closet? It certainly isn't me since I don't hold any of these beliefs. Unless you can understand what's going on and productively contribute, I think "Buzz off" is certainly relevant.
 
  • #43
Originally posted by Fliption
Who are you debating with? The skeleton in your closet? It certainly isn't me since I don't hold any of these beliefs. Unless you can understand what's going on and productively contribute, I think "Buzz off" is certainly relevant.
No, you hold no beliefs, you are a completely empty slate, with no opinions or viewpoints...(you are completely full of it, and you know it!*grins*)

Anyways, the "buzz off" was for you, since you, as a blank slate, can contribute nothing, right? The rest was just my feelings on this thread, not directed towards you, since you have no opinions on anything.
 
  • #44
Originally posted by Zero
No, you hold no beliefs, you are a completely empty slate, with no opinions or viewpoints...(you are completely full of it, and you know it!*grins*)

Anyways, the "buzz off" was for you, since you, as a blank slate, can contribute nothing, right? The rest was just my feelings on this thread, not directed towards you, since you have no opinions on anything.

I do have opinions. Having opinions and knowing them to accurately reflect reality are two different things. I lean certain ways on things and then as I read and learn more, I reconsider that leaning. I honestly wish I could stomp my feet on one view and say it is true. But I cannot. I envy all of you religious/dogmatic people who can.

And you of all people should know that when you respond with someone's quote attached, it is that person you are responding to and not the general population. So since your comments aren't relevant to my points, I'll just assume you aren't capable of responding to them specifically.
 
  • #45
Originally posted by Fliption
I do have opinions. Having opinions and knowing them to accurately reflect reality are two different things. I lean certain ways on things and then as I read and learn more, I reconsider that leaning. I honestly wish I could stomp my feet on one view and say it is true. But I cannot. I envy all of you religious/dogmatic people who can.

And you of all people should know that when you respond with someone's quote attached, it is that person you are responding to and not the general population. So since your comments aren't relevant to my points, I'll just assume you aren't capable of responding to them specifically.
I didn't think you had points, because points would assume you had taken a position, which you go out of your way to deny ever doing.
 
  • #46
Originally posted by Zero
I didn't think you had points, because points would assume you had taken a position, which you go out of your way to deny ever doing.


So you're saying that if I have no opinion on God and a person is trying to convince me that God exists, then I cannot explain to him why his argument is unconvincing? Must I hold a belief in no god to do this? This is nonsense and inflammatory. Please move on if you can't contribute.
 
  • #47
Originally posted by Fliption
So you're saying that if I have no opinion on God and a person is trying to convince me that God exists, then I cannot explain to him why his argument is unconvincing? Must I hold a belief in no god to do this? This is nonsense and inflammatory. Please move on if you can't contribute.
Uh huh...so why do you continue to post to me, just to tell me you don't like the way I post?
 
  • #48
Now, for those of us who have a position...LOL

Awareness, consciousness, and emotion are all evolutionarily beneficial refinements of the same neurological activity that exists in all animals. There is no evidence, and little logic (according to Mentat's posts), for the idea that consciousness and emotion exist on some plane besides the purely physical.
 
  • #49
Originally posted by Zero
Uh huh...so why do you continue to post to me, just to tell me you don't like the way I post?

Absolutely. I squash bugs when they come into my house too.
 
  • #50
Originally posted by Zero
Now, for those of us who have a position...LOL

Now if only we could get you to post it in the right forum.
 
  • #51
Originally posted by Mentat
Consciousness altogether - i.e. "awareness" - is possible in some of the more rudimentary of life-forms. It's consciousness of consciousness (also referred to as self-consciousness) and the ability for analysis that require a CPU something like ours.

Again, nice hypothesis, but that's all it is. It's something that sounds like it makes sense, but must be tested empirically to see if it is accurate or not. It's certainly not a well-established fact.

Anyway, you need to remember that, in the intentional stance, there is no distinction between the processing and the consciousness. So, such questions as "can something follow MD processing and not be conscious" or "can something be conscious in spite of not following MD" are really non sequiturs - and would be much like asking "can something be conscious without being conscious" :wink:.

This just takes us back to what I said in my last post. You're treating this MD hypothesis as if it is established fact, without having empirical justification to do so.

This hypothesis must be proved empirically before you go about applying its logic with certainty. Until that point, you must recognize that questions like "can something follow MD processing and not be conscious?" are valid inquiries.

No we wouldn't. I think there is a concept that is rather ingrained in your mind (and in most human minds) that doesn't allow for the heterophenomenological approach, but there is really nothing illogical about it. IOW, there is nothing wrong with assuming, not that "if A has physical properties X, then A is conscious", but rather "if A has physical properties X, then A has consciousness, because consciousness = X".

Consciousness = subjective, qualitative awareness. This is the proposition we must start from, if we are to get a meaningful understanding of consciousness. If you redefine consciousness from the start to merely be these physical properties X, then I'm afraid you're not talking about the same thing everybody else is when they say "consciousness."

Again, I am not stating that the intentional stance as such is wrong. Rather, I am stating that it must be subject to empirical verification; it cannot simply be defined a priori that such and such physical property "simply is" conscious. We must demonstrate that such and such property is conscious before we can make this statement.

So, we must be able to establish a relationship such as "physical properties X = subjective, qualitative awareness." We cannot do this by redefining terms to make our lives easier. Rather, we must empirically test to see if, indeed, physical properties X always entail the subjective experience of qualia Y.

Again, this is most emphatically a question of empirically finding physical correlates of consciousness, NOT redefining consciousness to mean those physical properties that we suspect are involved in such and such subjective experience.

But that's the whole point, we are starting from the intentional stance. Besides, and I want to be clear on this: We are not seeking empirical verification of subjective consciousness - this would imply an "apparatus" of consciousness - we are searching merely for the "apparatus" since that is consciousness.

But you can't know a priori which "apparatus" is consciousness and which isn't. Thus, you must empirically test to see which "apparati" are consciousness and which are not.

Not true...at least, not according to the heterophenomenological approach that Dennett takes (I can't state absolute truths), since we can determine the consciousness of any physical system using knowledge of MD and question/answer processes (if Dennett is right, that is).

That's a mighty big "if," Mentat. :wink: How do we change that "if" to "since"? Only by empirically testing the hypothesis!

But the dualism will always be implied when you separate "subjective experience" from the nitty-gritty of neurological science (the electrochemical processes themselves).

I am not saying that there is truly a separation between subjective experience and objective physical processes. Rather, I am saying there is a dissociation in our epistemic connection between the two, since we still do not have a complete picture of how objective processes map onto subjective experiences.
 
  • #52
Originally posted by hypnagogue
Consciousness = subjective, qualitative awareness. This is the proposition we must start from, if we are to get a meaningful understanding of consciousness. If you redefine consciousness from the start to merely be these physical properties X, then I'm afraid you're not talking about the same thing everybody else is when they say "consciousness."

it cannot simply be defined a priori that such and such physical property "simply is" conscious. We must demonstrate that such and such property is conscious before we can make this statement.

I have selected these quotes just to reiterate to Mentat that these words from Hypnagogue are the exact same point I'm trying to make. We see this the same way. This is why I have said that these statements attempt to "define the problem away" as opposed to dealing with it.
 
Last edited:
  • #53
Originally posted by hypnagogue
Again, nice hypothesis, but that's all it is. It's something that sounds like it makes sense, but must be tested empirically to see if it is accurate or not. It's certainly not a well-established fact.

Certainly not; however, it's the best I've seen. You see, Dennett cannot postulate that his is most definitely the theory of consciousness - in fact, his theory is structured so as to be slightly incomplete in the most nitty-gritty of details, so that he doesn't make any detailed scientific predictions (which, when proven wrong, would make people think the whole concept is wrong). However, I have used his reasoning to dismantle every other theory of consciousness I've ever heard of (and I've been doing some reading on this recently), unless it's basically a restatement of his own (a conclusion the author is surely not aware of, but which I have found to be true after analysis).

This just takes us back to what I said in my last post. You're treating this MD hypothesis as if it is established fact, without having empirical justification to do so.

Alright, I apologize if I sounded over-confident, but can you see an actual flaw in the theory?

This hypothesis must be proved empirically before you go about applying its logic with certainty. Until that point, you must recognize that questions like "can something follow MD processing and not be conscious?" are valid inquiries.

No they are not. The concept of doing process X and still not being conscious isn't at all compatible with the MD theory. If you are going to study Dennett's theory, then those questions become non sequiturs...if, OTOH, you are going to ask those questions regardless of a very nice theory that side-steps them, that's a completely different matter.

Consciousness = subjective, qualitative awareness. This is the proposition we must start from, if we are to get a meaningful understanding of consciousness. If you redefine consciousness from the start to merely be these physical properties X, then I'm afraid you're not talking about the same thing everybody else is when they say "consciousness."

Not true. If being "conscious" is nothing more than having process X occur in your brain, then I am indeed talking about the same things that every one else is talking about when they say "consciousness", because "subjective awareness" = function X.

Again, I am not stating that the intentional stance as such is wrong. Rather, I am stating that it must be subject to empirical verification; it cannot simply be defined a priori that such and such physical property "simply is" conscious. We must demonstrate that such and such property is conscious before we can make this statement.

Hold on now, I really hope you haven't just barely missed the point again. The process is not conscious (IOW, consciousness is not a property of the process), the process is consciousness.

So, we must be able to establish a relationship such as "physical properties X = subjective, qualitative awareness." We cannot do this by redefining terms to make our lives easier. Rather, we must empirically test to see if, indeed, physical properties X always entail the subjective experience of qualia Y.

What? Physical properties X have already been observed to be (note, "be", not produce) the process of being conscious of the external world. It is not so large a step to say that this is also how it happens when there is no external (outside of the brain) stimulus.

Again, this is most emphatically a question of empirically finding physical correlates of consciousness, NOT redefining consciousness to mean those physical properties that we suspect are involved in such and such subjective experience.

They are not involved in subjective experience, they are subjective experience. I know that's not been proven, but it is not reasonable to say "you haven't proven it, so it's not true" (though that may not be what you are doing, it's starting to seem like it).

But you can't know a priori which "apparatus" is consciousness and which isn't. Thus, you must empirically test to see which "apparati" are consciousness and which are not.

I don't understand this. If you can determine which apparati are involved in experiencing the objective world, then you have identified the ones that are involved in what we consider the production of a subjective world (provided you add memory of previous stimulation).

That's a mighty big "if," Mentat. :wink: How do we change that "if" to "since"? Only by empirically testing the hypothesis!

Fine, but at least Dennett's theory (or modifications thereof) can be empirically tested, as opposed to ones that require internal viewers and non-physical objects and other such impossibilities.

I am not saying that is truly a separation between subjective experience and objective physical processes. Rather I am saying there is a dissociation in our epistemic connection between the two, since we still do not have a complete picture of how objective processes map onto subjective experiences.

I really don't like that last statement...but you can probably already respond in your own mind as you know I would, and I have to get off-line right now.
 
  • #54
I saw a very interesting program on Spanish television the other night that I would like to share with whoever reads this. I am sure many have heard of out-of-body experiences of people who have a heart attack, die, and go into the tunnel of light - that is, lung, heart, and brain waves cease; clinically dead. There was one experience in particular that may shed some light on what we are discussing. A blind woman who never saw in her life died for one hour and returned to tell her story. She described in full and complete detail the intervention to try and save her life, the people who were in the hospital, the city she hovered over, the birds flying through the air, etc. This is proof in itself that consciousness is not only in the brain or body but is also an entity apart. An individual consciousness can feel emotions when in a body, but so can it also when outside of the body. It then appears that the body is only an instrument to manifest and move around in the physical plane that we live in. How is it possible to know that we feel emotion consciously? By having someone recount their experience from their consciousness when both alive and dead.
 
  • #55
Originally posted by Rader
I saw a very interesting program on Spanish television the other night that I would like to share with whoever reads this. I am sure many have heard of out-of-body experiences of people who have a heart attack, die, and go into the tunnel of light - that is, lung, heart, and brain waves cease; clinically dead. There was one experience in particular that may shed some light on what we are discussing. A blind woman who never saw in her life died for one hour and returned to tell her story. She described in full and complete detail the intervention to try and save her life, the people who were in the hospital, the city she hovered over, the birds flying through the air, etc. This is proof in itself that consciousness is not only in the brain or body but is also an entity apart. An individual consciousness can feel emotions when in a body, but so can it also when outside of the body. It then appears that the body is only an instrument to manifest and move around in the physical plane that we live in. How is it possible to know that we feel emotion consciously? By having someone recount their experience from their consciousness when both alive and dead.
The problem is, these are just unconfirmed stories, and we shouldn't just accept them at face value. There is no reputable scientific evidence that these 'out of body experiences' actually occur.
 
  • #56
Originally posted by Zero
The problem is, these are just unconfirmed stories, and we shouldn't just accept them at face value. There is no reputable scientific evidence that these 'out of body experiences' actually occur.

All data gained should be scientific. We should play the devil's advocate. You can be a doubting Thomas. This was a special case, and I have listened to very many. This was a controlled experiment. The blind woman was under strict control by doctors and scientists, as she had a large chance of dying and, if she made it through, would possibly be able to document her experience in the white tunnel. She had a brain tumor and was operated on. She died for one hour as the operation went on - clinically dead for one hour, not 3 minutes. No lung, heart, or brain function for one hour. She was blind and could never see, yet she could tell the doctors and scientists the brand name stamped on the scalpels used in her intervention. She heard, saw, and felt them cutting into her cranium. She documented precise moments during the operation by the clock in the room. Really, how much more proof do you want?
 
  • #57
Originally posted by Rader
All data gained should be scientific. We should play the devil's advocate. You can be a doubting Thomas. This was a special case, and I have listened to very many. This was a controlled experiment. The blind woman was under strict control by doctors and scientists, as she had a large chance of dying and, if she made it through, would possibly be able to document her experience in the white tunnel. She had a brain tumor and was operated on. She died for one hour as the operation went on - clinically dead for one hour, not 3 minutes. No lung, heart, or brain function for one hour. She was blind and could never see, yet she could tell the doctors and scientists the brand name stamped on the scalpels used in her intervention. She heard, saw, and felt them cutting into her cranium. She documented precise moments during the operation by the clock in the room. Really, how much more proof do you want?
That story is patently false.
 
  • #58
Originally posted by Zero
That story is patently false.

Might as well tell it how it is, eh Zero? :smile:
 