The Implications of Materialist Consciousness on Telepathy

  • Thread starter: Mentat
  • Tags: Consciousness
AI Thread Summary
The discussion centers on the compatibility of materialist views of consciousness with the concept of telepathy. Participants argue that if consciousness is solely a product of brain processes, then telepathy, which implies a non-physical connection between minds, cannot exist. Some suggest that the subconscious functions as part of the brain's processes, negating the need for an internal mind. Others introduce the idea of a "higher self" or subliminal consciousness, proposing that insights can emerge from a non-material aspect of awareness. Ultimately, the conversation highlights the tension between materialist interpretations of consciousness and the possibility of non-physical phenomena like telepathy.
  • #51
Originally posted by Mentat
But that is not Idealistic in any way. On the contrary, it is utterly Materialistic, since it completely eliminates "non-physical" parts of consciousness.

The functionalist position is not materialistic in any meaningful sense, insofar as it denies the importance of material properties in producing consciousness. According to the functionalist view, any physical process embodying the right 'calculations' will be conscious. This holds for a human brain, or a super-fast computer simulating a human brain, or a pile of rocks jumbled around over eons whose 'calculations' also simulate a human brain. There is something quite physically arbitrary in the notion of 'calculation.' Functionalism states that an abstract relationship among physical things is responsible for consciousness; to formulate a materialist understanding of consciousness, we must show consciousness to be grounded in at least some actual physical properties, and not entirely on abstract and essentially arbitrary relationships.

But it did participate with memory, and that's what's really important. The fact that these people are impaired in their visual centers' abilities to "question" the memory, in certain instances, should be what accounts for "blindsight".

Not so. Please check out the 'consciousness' thread in the Biology forum.

As to the rest of your points, I still don't think they are striking at the heart of the matter; they attempt to give descriptions of consciousness but nonetheless speak only of material concepts, and not how those material concepts are to be conceptually linked with qualitative mental phenomena in any meaningful sense beyond saying "these processes just are consciousness." This is really the central point where you and I have not been able to see eye to eye.

In reading "What is it like to be a bat?" by Thomas Nagel for the aforementioned consciousness thread, I see that Nagel discusses exactly the sort of conceptual difficulty that I have been talking about. Please read over this essay, and maybe you will better understand it via Nagel's language.

http://members.aol.com/NeoNoetics/Nagel_Bat.html
 
  • #52
Originally posted by hypnagogue
The functionalist position is not materialistic in any meaningful sense, insofar as it denies the importance of material properties in producing consciousness. According to the functionalist view, any physical process embodying the right 'calculations' will be conscious. This holds for a human brain, or a super-fast computer simulating a human brain, or a pile of rocks jumbled around over eons whose 'calculations' also simulate a human brain. There is something quite physically arbitrary in the notion of 'calculation.' Functionalism states that an abstract relationship among physical things is responsible for consciousness; to formulate a materialist understanding of consciousness, we must show consciousness to be grounded in at least some actual physical properties, and not entirely on abstract and essentially arbitrary relationships.

Well, I agree that it's supposed to be grounded in some physical properties, but functionalism is not wrong in saying that anything that processes like the human brain will be conscious. However, "processing" and "calculating" are hard concepts without some model, such as the Multiple Drafts (and question/answer) model, to explain how we calculate.

Not so. Please check out the 'consciousness' thread in the Biology forum.

As to the rest of your points, I still don't think they are striking at the heart of the matter; they attempt to give descriptions of consciousness but nonetheless speak only of material concepts, and not how those material concepts are to be conceptually linked with qualitative mental phenomena in any meaningful sense beyond saying "these processes just are consciousness." This is really the central point where you and I have not been able to see eye to eye.

In reading "What is it like to be a bat?" by Thomas Nagel for the aforementioned consciousness thread, I see that Nagel discusses exactly the sort of conceptual difficulty that I have been talking about. Please read over this essay, and maybe you will better understand it via Nagel's language.

http://members.aol.com/NeoNoetics/Nagel_Bat.html

Will do. Until then, I want to remind you that the intentional stance (which Dennett advocates) requires that one accept that such physical processes are consciousness, as opposed to "producing" consciousness, or "being linked to" consciousness. If there is nothing else to consciousness (and I see no reason, yet, why there should be) then it no longer needs to be a "mystery".
 