Ferris_bg
Pythagorean,
I consider the case where a change in the physical state brings a change in the mental state.
Imagine John, who has fully recovered from a coma (http://www.scholarpedia.org/wiki/images/2/2c/Image1.jpeg). Let's say his physical state before the injury was P1, during the coma P2, and after recovery P3 (P1 -> P2 -> P3). During the coma there was brain reorganization going on (P1 -> P2a -> P2b -> P2c -> ... -> P3). John was unconscious in P2 (meaning no M2 or Q), which makes P2 irrelevant to the discussion - we can simply consider the transition P1 -> P3.
The point of my writing in post #22 is that if we want to have epiphenomenal qualia in a reductionist view, we are faced with two incoherent options (see the sketch after this list):
1) a mental state M can't be a subset of Q, thus at time t we have multiple mental states
2) every time a Q state appears, the current M state must be interrupted
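To make the dilemma concrete, here is a minimal Python sketch in my own notation (the state names, the helper functions, and the "red quale" example are illustrative assumptions, not something stated in the thread). It assumes the reductionist picture above: each physical state fixes exactly one total mental state, and an epiphenomenal quale Q is a further experiential item at the same time t.

```python
# Minimal sketch of the two options, under the reductionist assumption that a
# physical state P fixes exactly one total mental state M(P).
from dataclasses import dataclass


@dataclass(frozen=True)
class MentalState:
    contents: frozenset  # everything experienced in this state


def reduce_to_mental(physical_state: str) -> MentalState:
    # Reductionist assumption: each physical state determines one mental state.
    table = {
        "P1": frozenset({"pre-injury experience"}),
        "P3": frozenset({"post-recovery experience"}),
    }
    return MentalState(table[physical_state])


def add_quale(m: MentalState, quale: str):
    # Option 1: M is not a subset of the qualia state, so M and the Q state
    # co-exist as distinct mental states at the same time t.
    option_1 = (m, MentalState(frozenset({quale})))
    # Option 2: the quale is folded into the mental state, so the current M
    # is interrupted and replaced by a different state.
    option_2 = MentalState(m.contents | {quale})
    return option_1, option_2


m3 = reduce_to_mental("P3")
two_states, interrupted = add_quale(m3, "red quale")
print(len(two_states))    # 2    -> multiple mental states at one time
print(interrupted != m3)  # True -> the original M state did not survive intact
```

Either way the sketch ends up with something the reductionist picture should not allow: two mental states at one time, or an M state that gets interrupted whenever a Q state appears.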
Q_Goest,
Let's first say that, as the IEP entry on functionalism puts it (http://www.iep.utm.edu/functism/), "Our belief that our fellow human beings have a mental life similar to ours is justified by an argument from analogy". So we can either find an algorithm for predicting a system's degree of consciousness, or we are left guessing.