
What or where is our real sense of self?

by Metarepresent
Tags: consciousness, qualia, recursion, representation
Q_Goest
#37
Feb13-11, 08:29 AM
Sci Advisor
HW Helper
PF Gold
Q_Goest's Avatar
P: 2,906
Quote Quote by Metarepresent View Post
Also, before continuing this long diatribe: what are the best phenomenal characteristics we may attribute to consciousness? I believe unity, recursive processing style and egocentric perspective are the best phenomenal target properties attributed to the ‘self’.
Hi Meta. I must have missed this one. What Ramachandran and others are referring to when they refer to the "phenomenal characteristics" of consciousness is in fact what I was referring to previously. Phenomenal consciousness regards the phenomenal characteristics: what something feels like, the experience of it, the subjective experience that we have. The sense of unity is a phenomenal characteristic in the sense that it is a subjective phenomenon, a feeling or experience. I don’t know exactly what you mean by "recursive processing style" but I presume this regards how neurons interact and/or the various higher order synaptic connections within the brain. If that’s the case, then this wouldn’t be a phenomenal characteristic of consciousness, it would be a psychological one (for lack of a better description) since it would be objectively definable and knowable. How neurons interact and give rise to conscious experience or metarepresentations is not a phenomenal characteristic of the brain or mind. The experience itself and metarepresentations are phenomenal characteristics. But how neurons interact to give rise to these phenomenal characteristics is not; they are purely neurological characteristics that can be objectively defined. Clearly, Ramachandran takes this view. See for example this YouTube video:
http://www.youtube.com/watch?v=jTWmTJALe1w

Quote Quote by Metarepresent View Post
Yeah, I adopt the stance of identity theory:
"the theory simply states that when we experience something - e.g. pain - this is exactly reflected by a corresponding neurological state in the brain (such as the interaction of certain neurons, axons, etc.). From this point of view, your mind is your brain - they are identical" http://www.philosophyonline.co.uk/po...ntity_what.htm
A neural representation is a mental representation.

No, he is claiming it is with the manipulation of metarepresentations ('representations of representations') that we engage in consciousness. Many other species have representational capacities, but not to the extent of humans.
That’s correct. Ramachandran takes a very basic, uncontroversial computationalist approach to consciousness. I think first you need to clarify the difference between phenomenal and psychological characteristics, though. Ramachandran is pointing out that certain circuits in the brain create phenomenal consciousness and it is these circuits that are ‘in control’ in some fashion. Further, he is suggesting, as I’m sure you are aware, that there is what’s commonly referred to as "a one-to-one" relationship between the neuronal activity and the phenomenal experience that is responsible for this mental representation. Ramachandran takes the fairly uncontroversial stance that the overall phenomenal experience (and the many parts of that experience) is what equates to the mental representation.

All this isn’t to say that our present approach to consciousness is without difficulties as Ramachandran points out in the video. Some very serious issues that pop out of this approach include “mental causation” and the problems with downward causation (ie: strong downward causation) and the knowledge paradox. So once we understand the basic concept of mental representation as presented for example by Ramachandran, we then need to move on to what issues this particular model of the human mind gives rise to and look for ways to resolve those issues.
ConradDJ
#38
Feb13-11, 09:23 AM
PF Gold
P: 302
Quote Quote by JDStupi View Post
It seems as though some are speaking about a specifically human sense of self, the one we are acquainted with and experience, and are speaking about the necessary conditions for the existence of our sense of self, not necessarily any sense of self.

.... On one hand, when we introspect or look at our sense of self we find that our I is built upon a vast network of memories, associative connections, generalized concepts, a re-flective (to bend back upon) ability, and embedding within a specific socio-cultural milieu. On the other hand we see that animals possess subjective states such as pain, pleasure, etc., which leads us to believe that animals too have an I. The problem then is: in what sense are the animal I and the human I the same in meaning or reference?
Quote Quote by Lievo View Post
So the question: is the hard problem hard because of what is making our sense of self specific to humans, or because of the part shared with many animals?

... The problem with any approach that puts too much emphasis on language is that everything that makes consciousness a hard problem is already present when we're looking at these 'just' conscious phenomena...

My comments were certainly aimed at what’s specifically human in “consciousness” and “sense of self.” I want to explain why that seems to me the fundamental issue and how it is bound up with language. I’m sorry for the length of this post... I’d be briefer if I thought I could be clear in less space.

First, I would say the “hard problem” is hard only because of our confusion between what’s specific to humans and what we share with other animals. When we treat “consciousness” as synonymous with “awareness” and talk about animals “possessing subjective states such as pain, pleasure, etc” or “having an ‘I’” we’re doing something that’s very basic to human consciousness – i.e. projecting our own kind of awareness on others, imagining that they see and feel and think like we do.

This is basic to our humanity because it’s the basis of human communication – experiencing other people as “you”, as having their own internal worlds from which they relate to us as we relate to them. This is what’s sometimes called “mindreading” in psychology. And if you go back to a primitive enough level, it’s something we share with other mammals, who are certainly able to sense and respond to what another animal is feeling.

This is one of a great many very sophisticated, highly evolved functions of the mammalian brain, that let animals focus their attention according to all kinds of very subtle cues in their environment. These abilities are quite amazing in themselves, but if you talk to biologists or ethologists I doubt you’ll find many who believe there is a “hard problem” here. It’s clear why these capacities evolved, it’s clear how evolution works, and there’s no mystery any more about the relationship of evolving self-reproducing organisms to the physical world.

Now what’s special about humans is not our neural hardware but the software we run on it, namely human language. It’s not at all similar to computer software (and our neural hardware is equally different from computer hardware) – but even so the term is a good one, because it distinguishes between what’s “built in” to our brains genetically and what’s “installed” from the outside, through our interpersonal relationships, as we grow up.

Part of the problem we have in understanding the situation with “consciousness” is that by now this software has evolved to become extremely sophisticated – it gives us all kinds of tools for directing our attention and responding to our environment that go far beyond what other animals do. And we were already very well versed in using much of this attention-focusing (“representing”) software when we were only 3 years old, long before we ever began to think about anything. We learned to use words like “you” and “me” long before we had any capacity for reflection.

So one lesson is – there’s no absolute difference between humans and other animals, just as there’s none between living beings and non-living matter. In both cases there comes to be a vast difference, because of a long, gradual evolutionary process that developed qualitatively new kinds of capacities.

Another lesson is – when we try to think reflectively about ourselves and the world, we’re using sophisticated conceptual tools that are built on top of even more sophisticated linguistic software, through which we interpret the world unconsciously, i.e. automatically. For example, we automatically tend to project our own kind of awareness on other people and animals and even (in many pre-industrial cultures) on inanimate objects in the world around us.

So as to the “hard problem” – when we humans look around and “perceive” things around us, we’re just as completely unconscious of the vast amount of linguistic software-processing involved, as we are of the vast amount of neural hardware-processing that’s going on. It’s very easy to imagine we’re doing something like what a cat does when it looks around, “just seeing.” And if we start thinking about it reflectively, we may very easily convince ourselves that the cat “must have an internal subjective world of sensation” or even “a self.” And now we do have a hard problem, one that will never be solved because its terms will never even be clearly defined.

We end up talking about “what it’s like to be a bat,” for example, as if that clarified the issue. But the difference between what it’s like to be you and what it’s like to be a non-talking animal is on the same order of magnitude as the difference between a stone and a living cell. It’s just that we’re not as far along at understanding our humanity as we are at understanding biology.
ConradDJ
#39
Feb13-11, 10:27 AM
PF Gold
P: 302
Quote Quote by JDStupi View Post
The hard problem being based around the divide between "what it feels like" and what its "objective" manifestations are, the animal most likely does have qualia and as such there is a hard problem for it as there is for us.

Here, I do not claim to have more knowledge of what constitutes "awareness", and the relationships between "awareness" and "subjectivity", but it does seem that we must trace it biologically much further down than humans in order to properly understand it.

I agree that there's an aspect of the subject/object divide that goes back a long way in evolution. There’s a basic difference in perspective between seeing something “from outside” – the way we see objects – and seeing “from inside”, from one’s own point of view.

In fact, I think this kind of difference is important even in physics. Physics has its own “hard problem” having to do with the role of “the observer” in quantum mechanics. It seems that not even the physical world is fully describable objectively, “from outside” – at a fundamental level, it seems that you have to take a point of view inside the web of physical interaction in order to grasp its structure.

So it may well turn out to be meaningful to talk not only about the point of view of an animal, but also about the point of view of an atom.

The problem with treating this as a problem of “consciousness” – as even some reputable physicists do, sadly – is what I was trying to get at in my previous post. When we do that, we unconsciously import all kinds of unstated assumptions that an animal’s point of view on the world or an atom’s must be similar to our own.

Before anyone begins to ask questions about the relation of our conscious experience to that of an animal, I would recommend that they spend some time thinking about the very great differences that exist between the conscious experience of humans, say, in oral cultures and in literate culture. (Look up Walter Ong’s work or Eric Havelock’s, for example.) That helps give a sense of how radically different one’s “consciousness” and “sense of self” can be, even when the underlying language has hardly changed.

So the gist of my position is not that only humans have some sort of “internal life” going on in their brains. It’s that we tend to use terms like “consciousness” and “subjectivity” and “representation” to cover a vast range of very different things. If we want to understand these issues better, it’s the differences we should be focusing on.
JDStupi
#40
Feb13-11, 11:25 AM
P: 111
Quote Quote by ConradDJ View Post
So the gist of my position is not that only humans have some sort of “internal life” going on in their brains. It’s that we tend to use terms like “consciousness” and “subjectivity” and “representation” to cover a vast range of very different things. If we want to understand these issues better, it’s the differences we should be focusing on.
Certainly, this is exactly what I was getting at, and I completely agree. When I asked in what way the two "I"s were the same in meaning or reference, essentially what I was getting at is what you said: that the ideas of "I", "awareness", and "self" are not particulars we refer to with established definitions, but rather broad classes of things. We cannot seek some one explanatory principle that will illuminate the "nature of consciousness/awareness" and attempt to apply it to everything, for the sole reason that "something that explains everything, explains nothing". Everything would just blend together into an indistinguishable blob, rather than letting us see the differences and their working interrelations and from this further sharpen our ideas on these subjects.


Quote Quote by ConradDJ View Post
The problem with treating this as a problem of “consciousness” – as even some reputable physicists do, sadly
I agree with this also: those physicists who jump from the measurement problem's interactive aspects directly to "consciousness must be the source of weirdness" are making terribly large, unjustified leaps and are suffering from "Good Science/Bad Philosophy". But we shouldn't get into that here.

Quote Quote by ConradDJ View Post
So as to the “hard problem” – when we humans look around and “perceive” things around us, we’re just as completely unconscious of the vast amount of linguistic software-processing involved, as we are of the vast amount of neural hardware-processing that’s going on. It’s very easy to imagine we’re doing something like what a cat does when it looks around, “just seeing.” And if we start thinking about it reflectively, we may very easily convince ourselves that the cat “must have an internal subjective world of sensation” or even “a self.”
This much I agree with also. In principle, I agree that there exists a "hard problem of consciousness" for both humans and animals, because I do believe that animals have qualia and internal states. Though what you touch upon is that methodologically, in our present state of knowledge, it would be silly to attempt to come up with a definitive criterion for "awareness" and "self" and the like, and that we must instead attempt some means of description that is somewhat independent of our folk-psychological notions of consciousness. These questions will only be answered by examining the many variegated processes that occur at all levels of biological organization. Moreover, studying specifically human forms of consciousness will most certainly shed light on the issues, because we will come to understand the nature of our perception better, and the ideas generated in the study of the emergence of human awareness and "self" will IMO undoubtedly prove useful for ideas in other biological fields.

And yes, taking cues from older Gestalt views in psychology, the nature of our internal awareness is structured in various gestalts: our perception of the moment is not the sum total of external qualia, but rather an integrated gestalt which is the structure of our internal experience. Drastically different cultures which focus on drastically different aspects of human life will almost certainly have a completely different structuring of their internal perceptual manifold. This is relevant because how could we hope to understand the nature of "what it is like to be a bat" from an internal perspective if we can barely understand what it is like to be in a completely different culture, or even what it is like to be the guy next door? This, I believe, is simply an unsolvable problem.
apeiron
#41
Feb13-11, 01:19 PM
PF Gold
apeiron's Avatar
P: 2,432
Quote Quote by Lievo View Post
You claimed to have a specific argument against a systems approach. Let's hear it. Or if you want to evade the question by pointing at a page of links, identify the paper you believe supplies the argument.

Quote Quote by Lievo View Post
First news that current practice is based on a philosophical systems approach! I'd have said it is based on common sense and that no one gives 'the systems approach' a **** when it comes to medical practice. Always happy to learn something.
Again you supply your personal and unsupported reaction against an actual referenced case where people had to sit down and come up with a valid position on a fraught issue. It may indeed seem common sense to then arrive at a systems definition of consciousness here, but that rather undermines your comment, doesn't it?
apeiron
#42
Feb13-11, 01:43 PM
PF Gold
apeiron's Avatar
P: 2,432
Quote Quote by ConradDJ View Post
So the gist of my position is not that only humans have some sort of “internal life” going on in their brains. It’s that we tend to use terms like “consciousness” and “subjectivity” and “representation” to cover a vast range of very different things. If we want to understand these issues better, it’s the differences we should be focusing on.
This is precisely what people who want to debate consciousness should demonstrate a grasp of - the rich structure of the mind.

The hard problem tactic is to atomise the complexity of phenomenology. Everything gets broken down into qualia - the ineffable redness of red, the smell of a rose, etc. By atomising consciousness in this way, you lose sight of its actual complexity and organisation.

Robert Rosen called it isomerism. You can take all the molecules that make up a human body and dissolve them in a test tube. All the same material is in the tube, but is it still alive? Clearly you have successfully reduced a body to its material atoms, its components, but have lost the organisation that actually meant something.

Philosophy based on qualia-fication does the same thing. If you atomise, you no longer have a chance of seeing the complex organisation. So while some find qualia style arguments incredibly convincing, this just usually shows they have not ever really appreciated the rich complexity of the phenomenon they claim to be philosophising about.
Lievo
#43
Feb13-11, 11:26 PM
P: 268
Quote Quote by ConradDJ View Post
My comments were certainly aimed at what’s specifically human in “consciousness” and “sense of self.”
Sure, which is to me the problem. Interesting post. If I may reformulate, you're saying that we should not anthropomorphise: it's not because my brain interprets an animal as having a feeling that the animal has this feeling – my brain doesn't know what's going on inside, it just interprets the external signs as human-like because of the way it is wired.

Fair enough. This is not attackable in any way – which is why it's the hard problem again. But when should we stop this line of thinking? By the very same logic, you don't know that I or anyone but you is conscious. You're making an analogy with yourself, and you may well be wrong to do so.

Of course I don't believe this solipsism, but it's exactly the same logic for chimps. So where do we stop? At this point, you would probably say: hey but humans are humans: we know that's the same kind of brain so for our species it's correct to reject solipsism.

I like this way of thinking, and would agree I don't know what it's like to be a bat. But I'd argue that both the behavior and the brain of chimps are so similar to the human behavior and brain that I don't see any reason to think their thoughts are significantly different.

This was not obvious in Vygotsky's time, but have a look at these:

http://www.nytimes.com/2010/06/22/sc...himp.html?_r=1
http://www.cell.com/current-biology/...822(10)00459-8

When I see a young chimp peeing on the dead body of the one he attacked, I tend to think his subjective experience is quite similar to a human's in the same situation. I may be wrong, but I find the body of ethological data collected in recent years quite convincing.

Quote Quote by ConradDJ View Post
These abilities are quite amazing in themselves, but if you talk to biologists or ethologists I doubt you’ll find many who believe there is a “hard problem” here. (...) It’s clear why these capacities evolved, it’s clear how evolution works, and there’s no mystery any more about the relationship of evolving self-reproducing organisms to the physical world.
*cough cough*. Well I don't want to argue or discuss it further, but I promise it's not the view of most of my colleagues.
Lievo
#44
Feb13-11, 11:31 PM
P: 268
Quote Quote by ConradDJ View Post
Before anyone begins to ask questions about the relation of our conscious experience to that of an animal, I would recommend that they spend some time thinking about the very great differences that exist between the conscious experience of humans, say, in oral cultures and in literate culture. (Look up Walter Ong’s work or Eric Havelock’s, for example.) That helps give a sense of how radically different one’s “consciousness” and “sense of self” can be, even when the underlying language has hardly changed.
Interesting. Consider this a request, then: if you could turn it into a specific claim supported by specific evidence, that would be interesting.
Lievo
#45
Feb13-11, 11:42 PM
P: 268
Quote Quote by apeiron View Post
identify the paper you believe supplies the argument.
Sure. How could I not answer such a polite demand? However, if you don't mind, I'll wait for you to start providing the experimental evidence supporting some of your past claims.

Quote Quote by apeiron View Post
Again you supply your personal and unsupported reaction against an actual referenced case
My personal reaction is the reaction of a scientist who has worked specifically on this issue. So, to be perfectly clear, I know that at least regarding coma, your statement is bull.
apeiron
#46
Feb14-11, 02:08 AM
PF Gold
apeiron's Avatar
P: 2,432
Quote Quote by Lievo View Post
My personal reaction is the reaction of a scientist who has worked specifically on this issue. So, to be perfectly clear, I know that at least regarding coma, your statement is bull.
As usual, references please. And here you seem to be saying that you can even cite your own research. So why be shy?
ConradDJ
#47
Feb14-11, 06:40 AM
PF Gold
P: 302
Quote Quote by JDStupi View Post
In principle, I agree that there exists a "hard problem of consciousness" for both humans and animals, because I do believe that animals have qualia and internal states...

Drastically different cultures which focus on drastically different aspects of human life will almost certainly have a completely different structuring of their internal perceptual manifold. This is relevant because, how could we hope to understand the nature of "what it is like to be a bat" from an internal perspective if we can barely understand what it is like to be in a completely different culture, or even what it is like to be the guy next door.

JD – Thanks, what you say makes sense to me. The point is not to stop people from trying to understand the mental life of chimps or bats, if for some reason they’re inclined to do that. But when people think about consciousness, I imagine it’s usually because they’re trying to understand their own – which is of course the only mentality they’ll ever actually experience. And it’s a very bad way to start by skipping over the many, many levels of attention-focusing techniques that we learn as we grow up into language – which is pretty much what human consciousness is made of – and defining the topic of study as some kind of generic “sense of self” that chimps have but maybe mollusks don’t.

There’s no absolute dividing-line between “conscious” and “non-conscious” – or “sentient” and “non-sentient” for that matter. These are highly relative terms. But I believe there are two huge leaps in our evolutionary history, with the origin of life and the origin of human communication. The problem is that we understand the first of these relatively well, and the second almost not at all.

Quote Quote by apeiron View Post
This is precisely what people who want to debate consciousness should demonstrate a grasp of - the rich structure of the mind...

Philosophy based on qualia-fication does the same thing. If you atomise, you no longer have a chance of seeing the complex organisation.

Yes. I do think the status of “qualia” is an interesting topic – but again, this is a highly relative term. As you well understand, it’s not as though there is something like a simple, “primitive” experience of “red” we can go back to and take as a pristine starting point for assembling a conscious mind.

Quote Quote by Lievo View Post
By the very same logic, you don't know that I or anyone but you is conscious. You're making an analogy with yourself, and you may well be wrong to do so.

Of course I don't believe this solipsism, but it's exactly the same logic for chimps. So where do we stop? At this point, you would probably say: hey but humans are humans: we know that's the same kind of brain so for our species it's correct to reject solipsism.

No, you’re not getting my point. I’m not interested in proving that solipsism is incorrect – no sensible person needs to do that. I’m just as little interested in proving that chimps are like humans, or unlike humans, both being obviously true in various ways. How can there be a “correct” answer to the question, “Do chimps have an internal mental life like ours?”

My point is that we interpret the mentality of others by projecting, and that this is a primary human capacity. If we didn’t imagine each other as people, communication in the human sense would be inconceivable. It’s obvious to me that no one imagines anyone else’s internal world “correctly” – what would that even mean? But because we automatically imagine each other as conscious beings, we open up the possibility of talking with each other even about our private, internal experience, and it’s amazing how profound the experience of communication can seem, sometimes.

But because the projective imagination is so fundamental to us, if we want to understand anything at all about consciousness and its history, we have to focus on the differences. Unless we point to specific distinctions between different ways of being "conscious", we literally don't know what we're talking about.

For example – before Descartes, I think it’s reasonable to suppose that no one in the history of human thought had ever experienced the world as divided between an “external objective reality” and a “subjective inner consciousness”... because that difference had never before been conceptualized, never brought into human language. But today any thinking person probably experiences the world that way, because the subject/object split became basic to Western thought during the 17th century, and long ago percolated into popular language. So today it takes a real stretch of imagination to glimpse what “consciousness” was like in the Renaissance... and we can only hope to do that if we can focus on landmarks like Descartes’ Meditations and take them seriously as consciousness-changing events.

I expect that to be controversial – but that’s what I mean by “focusing on the differences” in consciousness.
Jimmy Snyder
#48
Feb14-11, 07:27 AM
P: 2,179
I was going to suggest the mirror test, but I thought it would be a good idea to try it out on myself first. Step 1 is to take a good look at yourself in the mirror. Then put a mark on your forehead and look again in the mirror. If you don't pick at your forehead, that means either that you are not self-aware, or that you are a slob. If you do pick at your forehead, it means that you are self-aware – but not necessarily: first you have to consider whether you always pick at your forehead. I failed the test myself, but I'm told that chimps, dolphins, and elephants have passed it.
apeiron
#49
Feb14-11, 03:25 PM
PF Gold
apeiron's Avatar
P: 2,432
Quote Quote by ConradDJ View Post
Yes. I do think the status of “qualia” is an interesting topic – but again, this is a highly relative term. As you well understand, it’s not as though there is something like a simple, “primitive” experience of “red” we can go back to and take as a pristine starting point for assembling a conscious mind.
A lot of philosophy of mind is motivated by a naive atomistic definition of qualia. My general argument against this is the dynamicist's alternative. Qualia are like the whorls of turbulence that appear in a stream. Each whorl seems impressively like a distinct structure. But try to scoop the whorl out in a bucket and you soon find just how contextual that localised order was.

Plenty of people have of course reacted against the hegemony of atomistic qualia. For instance, here are two approaches that try to emphasise the contextual nature of seeing any colour.

On this new view of the origins of color vision, color is far from an arbitrary permutable labeling system. Our three-dimensional color space is steeped with links to emotions, moods, and physiological states, as well as potentially to behaviors. For example, purple regions within color space are not merely a perceptual mix of blue and red, but are also steeped in physiological, emotional and behavioral implications – in this case perhaps of a livid male ready to punch you.

http://changizi.wordpress.com/2010/1...-red-the-same/
Building on these insights, this project defines qualia to be salient chunks of human experience, which are experienced as unified wholes, having a definite, individual feeling tone. Hence the study of qualia is the study of the "chunking," or meaningful structuring, of human experience. An important, and seemingly new, finding is that qualia can have complex internal structure, and in fact, are hierarchically organized, i.e., they not only can have parts, but the parts can have parts, etc. The transitive law does not hold for this "part-of" relation. For example, a note is part of a phrase, and a phrase is part of a melody, and segments at each of these three levels are perceived as salient, unified wholes, and thus as qualia in the sense of the above definition - but a single note is not (usually) perceived as a salient part of the melody.

http://cseweb.ucsd.edu/~goguen/projs/qualia.html
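Goguen's point about the non-transitive "part-of" relation is easy to make concrete. Here is a minimal Python sketch of that idea (the class and the names are my own illustration, not anything from his project): salience is judged one level at a time, so a note can be a salient part of a phrase, and the phrase a salient part of the melody, without the note being a salient part of the melody.

class Chunk:
    def __init__(self, label, parts=None):
        self.label = label
        self.parts = parts or []              # immediate constituents only

    def salient_parts(self):
        # only immediate constituents count as salient parts at this level
        return list(self.parts)

    def is_salient_part_of(self, whole):
        return self in whole.salient_parts()

note = Chunk("C#")
phrase = Chunk("opening phrase", [note])
melody = Chunk("melody", [phrase])

print(note.is_salient_part_of(phrase))    # True
print(phrase.is_salient_part_of(melody))  # True
print(note.is_salient_part_of(melody))    # False: the "part-of" relation is not transitive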
And also these neo-whorfian experiments show how language actively plays a part in introspective awareness, supporting the general Vygotskian point too.

Boroditsky compared the ability of English speakers and Russian speakers to distinguish between shades of blue. She picked those languages because Russian does not have a single word for blue that covers all shades of what English speakers would call blue; rather, it has classifications for lighter blues and darker blues as different as the English words yellow and orange. Her hypothesis was that Russians thus pay closer attention to shades of blue than English speakers, who lump many more shades under one name and use more vague distinctions. The experiment confirmed her hypothesis. Russian speakers could distinguish between hues of blue faster if they were called by different names in Russian. English speakers showed no increased sensitivity for the same colors. This suggests, says Boroditsky, that Russian speakers have a "psychologically active perceptual boundary where English speakers do not."

http://www.stanfordalumni.org/news/m...oroditsky.html
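The shape of that comparison is simple enough to sketch. The toy analysis below uses invented reaction times purely to show the predicted pattern, not Boroditsky's data: a cross-boundary speed advantage for Russian speakers and essentially none for English speakers.

import statistics

# invented reaction times in ms - illustrative only, not Boroditsky's data
rt = {
    "Russian": {"cross_boundary": [410, 395, 420], "same_category": [470, 455, 465]},
    "English": {"cross_boundary": [450, 460, 445], "same_category": [455, 448, 452]},
}

for lang, cond in rt.items():
    advantage = statistics.mean(cond["same_category"]) - statistics.mean(cond["cross_boundary"])
    print(f"{lang}: cross-boundary advantage = {advantage:.0f} ms")

# predicted pattern: a clear positive advantage for Russian speakers (a
# "psychologically active perceptual boundary"), roughly zero for English speakers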
But clearly there is a heck of a lot more to be said about the status of qualia. It does not seem too hard to argue their contextual nature. It is in fact an "easy problem". Harder is then saying something about where this leaves the representational nature of a "contextualised qualia".

Even as a whorl in the stream, a qualitative feeling of redness is "standing for something" in a localising fashion. It is meaningfully picking out a spot in a space of possible reactions. So we have to decide whether, using the lens of Peircean semiotics, we are dealing with a sign that is iconic, indexical or symbolic. Or using Pattee's hierarchy theory distinctions, does qualiahood hinge on the rate dependent/rate independent epistemic cut?

In other words, we cannot really deal satisfactorily with qualia just in information theoretic terms. The very idea of representation (as a construction of localised bits that add up to make an image) is information based. And even if we apply a "contextual correction", pointing out the web of information that a qualia must be embedded in, we have not solved the issue in a deep way...because we have just surrounded one bit with a bunch of other bits.

What we need instead is a theory of meaning. Which is where systems approaches like semiotics and hierarchy theory come in.
apeiron
#50
Feb14-11, 05:55 PM
PF Gold
apeiron's Avatar
P: 2,432
Goguen is a good source for a dynamicist/systems take on qualia and representations.

http://cseweb.ucsd.edu/~goguen/projs/qualia.html

You can see how this is a battle between two world views, one based on bottom-up causal explanations (which start off seeming to do so well because they make the world so simple, then eventually smack into hard problems when it becomes obvious that something has gone missing), the other based on a systems causality where there is also a global level of top-down constraint, and so causality is a packaged deal involving the interactions between local bottom-up actions, and global top-down ones.

Anyway some snippets from a Goguen paper on musical qualia which illustrates what others are currently saying (and using all the same essential concepts that I employ such as hierarchy, semiosis, anticipation, socialisation, contextuality....).

http://cseweb.ucsd.edu/~goguen/pps/mq.pdf

Musical Platonism, score nominalism, cognitivism, and modernist approaches in general, all assume the primacy of representation, and hence all founder for similar reasons. Context is crucial to interpretation, but it is determined as part of the process of interpretation, not independently or in advance of it. Certain elements are recognized as the context of what is being interpreted, while others become part of the emergent musical "object" itself, and still others are deemed irrelevant. Moreover, the elements involved and their status can change very rapidly. Thus, every performance is uniquely situated, for both performers and listeners, in what may be very different ways.

In particular, every performance is embodied, in the sense that very particular aspects of each participant are deeply implicated in the processes of interpretation, potentially including their auditory capabilities, clothing, companions, musical skills, prior musical experiences, implicit social beliefs (e.g., that opera is high status, or that punk challenges mainstream values), spatial location, etc., and certainly not excluding their reasons for being there at all (this is consistent with the cultural historical approach of Lev Vygotsky).

Most scientific studies of art are problematic for similar reasons. In particular, the third person, objective perspective of science requires a stable collection of "objects" to be used as "data," which therefore become decontextualized, with their situatedness, embodiment, and interactive social nature ignored.

This paper draws on data from both science and phenomenology, in a spirit similar to the "neurophenomenology" of Francisco Varela, as a way to reconcile first and third person perspectives, by allowing each to impose constraints upon the other. Such approaches acknowledge that the first and third person perspectives reveal two very different domains, neither of which can be reduced to the other, but they also deny that these domains are incompatible.

Whatever approach is taken, qualia are often considered to be atomic, i.e., non-reducible, or without constituent parts, in harmony with doctrines of logical positivism, e.g., as often attributed to Wittgenstein’s Tractatus. Though I have never seen it stated quite so baldly, the theory (but perhaps "belief" is a better term, since it is so often implicit) seems to be that qualia atoms are completely independent from elements of perception and cognition, but somehow combine with them to give molecules of experience.

Peirce was an American logician concerned with problems of meaning and reference, who concluded that these are relational rather than denotational, and who also made an influential distinction among modes of reference, as symbolic, indexical, or iconic.... A semiotic system or semiotic theory consists of: a signature, which gives names for sorts, subsorts, and operations; some axioms; a level ordering on sorts having a maximum element called the top sort; and a priority ordering on the constructors at each level....
Axioms are constraints on the possible signs of a system. Levels express the whole-part hierarchy of complex signs, whereas priorities express the relative importance of constructors and their arguments; social issues play an important role in determining these orderings. This approach has a rich mathematical foundation....
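Just to fix ideas, that definition can be written down almost directly as a data structure. The sketch below is my own paraphrase in Python, not Goguen's notation or tooling; it only records the ingredients the quote lists (signature, axioms, level ordering with a top sort, priority ordering).

from dataclasses import dataclass

@dataclass
class SemioticSystem:
    sorts: set                 # names of sorts
    subsorts: dict             # subsort -> supersort
    operations: dict           # constructor name -> (argument sorts, result sort)
    axioms: list               # constraints on the possible signs of the system
    level: dict                # level ordering on sorts (whole-part hierarchy)
    priority: dict             # priority ordering on constructors

    def top_sort(self):
        # the maximum element of the level ordering
        return max(self.level, key=self.level.get)

music = SemioticSystem(
    sorts={"note", "phrase", "melody"},
    subsorts={},
    operations={"concat": (("note", "note"), "phrase")},
    axioms=["a phrase contains at least one note"],
    level={"note": 0, "phrase": 1, "melody": 2},
    priority={"concat": 1},
)
print(music.top_sort())   # melody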

The Anticipatory Model captures aspects of Husserl’s phenomenology of time. For example, it has versions of both retention and protention, and the right kind of relationship between them. It also implies Husserl’s pithy observation that temporal objects (i.e., salient events or qualia) are characterized by both duration and unity. Since it is not useful to anticipate details very far into the future, because the number of choices grows very quickly, an implementation of protention, whether natural or artificial, needs a structure to accommodate multiple, relatively short projections, based on what is now being heard, with weights that increase with elapsed time; this is a good candidate for implementation by a neural net of competing Hebbian cell assemblies, in both the human and algorithmic instantiations, as well as robots (as in [71]), and it also avoids reliance on old style AI representation and planning.

This paper has attempted to explore the qualitative aspects of experience using music as data, and to place this exploration in the context of some relevant philosophical, cognitive scientific, and mathematical theories. Our observations have supported certain theories and challenged others. Among those supported are Husserl’s phenomenology of time, Vygotsky’s cultural-historical approach, and Meyer’s anticipatory approach, while Chalmers’ dualism, Brentano’s thesis on intentionality, qualia realism, qualia atomism, Hume’s pointilist time, and classical cognitivism have been disconfirmed at least in part.

In particular, embodiment, emotion, and society are certainly important parts of how real humans can be living solutions to the symbol grounding problem. The pervasive influence of cognitivism is presumably one reason why qualia in general, and emotion in particular, have been so neglected by traditional philosophy of mind, AI, linguistics, and so on. We may hope that this is now beginning to change.

This suggests that all consciousness arises through sense-making processes involving anticipation, which produce qualia as sufficiently salient chunks. Let us call this the C Hypothesis; it provides a theory for the origin and structure of consciousness. If correct, it would help to explain why consciousness continues to seem so mysterious: it is because we have been looking at it through inappropriate, distorting lenses; for example, attempting to view qualia as objective facts, or to assign them some new ontological status, instead of seeing segmentation by saliency as an inevitable feature of our processes of enacted perception.
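One of those snippets is concrete enough to turn into a toy program: the protention structure of the Anticipatory Model, with several short projections whose weights grow as they keep being confirmed. The sketch below is only my own illustration of that idea; the weighting rule and names are not Goguen's implementation.

class Protention:
    def __init__(self, candidates):
        # each candidate is a short projected continuation (a list of expected events)
        self.projections = [{"expected": list(c), "age": 0, "weight": 1.0} for c in candidates]

    def hear(self, event):
        survivors = []
        for p in self.projections:
            if p["expected"] and p["expected"][0] == event:
                p["expected"].pop(0)
                p["age"] += 1
                p["weight"] += p["age"]   # weight grows with elapsed (confirmed) time
                survivors.append(p)
        self.projections = survivors      # failed projections are dropped

listener = Protention([["E", "F", "G"], ["E", "D", "C"], ["G", "A", "B"]])
for note in ["E", "F"]:
    listener.hear(note)
print([(p["expected"], p["weight"]) for p in listener.projections])
# only the E-F-G projection survives (with "G" still expected), and its weight has grown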
Lievo
#51
Feb14-11, 08:42 PM
P: 268
Quote Quote by ConradDJ View Post
I’m just as little interested in proving that chimps are like humans, or unlike humans, both being obviously true in various ways. How can there be a “correct” answer to the question, “Do chimps have an internal mental life like ours?”
Ok let me challenge your lack of interest in defining the topic of study as some kind of generic “sense of self” that chimps have but maybe mollusks don’t. You'll agree that chimps behave in ways we can't yet program, which is a good reason to believe we are missing something below the human level. My guess is that once one is able to program something that behaves like a chimp, it won't be too hard to go up to the human level. What I think may challenge your lack of interest is that this last thought should be shared by anyone who regards language and culture as of primary importance to the human spirit. Don't you think?

Quote Quote by ConradDJ View Post
My point is that we interpret the mentality of others by projecting, and that this is a primary human capacity.
I won't argue with your line of thinking, but the premise is just not supported by the current evidence.

http://www.sciencemag.org/content/31.../1967.abstract

Quote Quote by ConradDJ View Post
For example – before Descartes, I think it’s reasonable to suppose that no one in the history of human thought had ever experienced the world as divided between an “external objective reality” and a “subjective inner consciousness”... because that difference had never before been conceptualized, never brought into human language.
http://en.wikipedia.org/wiki/Dualism...sophy_of_mind)

Ideas on mind/body dualism are presented in Hebrew Scripture (as early as Genesis 2:7), where the Creator is said to have formed the first human as a living, psycho-physical fusion of mind and body – a holistic dualism. Mind/body dualism is also seen in the writings of Zarathushtra. Plato and Aristotle deal with speculations as to the existence of an incorporeal soul that bore the faculties of intelligence and wisdom. They maintained, for different reasons, that people's "intelligence" (a faculty of the mind or soul) could not be identified with, or explained in terms of, their physical body.
Pythagorean
#52
Feb14-11, 09:25 PM
PF Gold
Pythagorean's Avatar
P: 4,287
Quote Quote by apeiron
Qualia are like the whorls of turbulence that appear in a stream. Each whorl seems impressively like a distinct structure. But try to scoop the whorl out in a bucket and you soon find just how contextual that localised order was.
apeiron, you might be interested in this paper by Rabinovich that references experiments by Laurent and Friedrich:

Experimental observations in the olfactory systems of locust (7) and zebrafish (8) support such an alternative framework. Odors generate distributed (in time and space), odor- and concentration-specific patterns of activity in principal neurons. Hence, odor representations can be described as successions of states, or trajectories, that each correspond to one stimulus and one concentration (9). Only when a stimulus is sustained does its corresponding trajectory reach a stable fixed-point attractor (10). However, stable transients are observed whether a stimulus is sustained or not – that is, even when a stimulus is sufficiently short-lived that no fixed-point attractor state is reached. When the responses to several stimuli are compared, the distances between the trajectories corresponding to each stimulus are greatest during the transients, not between the fixed points (10).
Rabinovich, M., Huerta, R., Laurent, G. (2008) Transient dynamics for neural processing. Science 321, 5885
http://biocircuits.ucsd.edu/Papers/Rabinovich08.pdf

References in paper
7. G. Laurent, H. Davidowitz, Science 265, 1872 (1994).
8. R. Friedrich, G. Laurent, Science 291, 889 (2001).
ConradDJ
#53
Feb15-11, 05:22 AM
PF Gold
P: 302
Quote Quote by apeiron View Post
Goguen is a good source for a dynamicist/systems take on qualia and representations.

http://cseweb.ucsd.edu/~goguen/projs/qualia.html
Apeiron -- thanks very much for this link... I have a particular interest in how we perceive melodies, partly because it helps me think about the layered structure of time. I'll look at this and perhaps we'll discuss it in another thread.
ConradDJ
#54
Feb15-11, 06:24 AM
PF Gold
P: 302
Hi Lievo – responding to your #51 above...

When I said projective interpreting was a primary human capacity, I didn’t mean to imply that it’s uniquely human – though of course in some ways that’s true. But I noted in post #38 above that “mind-reading” at a primitive level is something we share with other mammals. I would say that all our deep emotional sensitivities are essentially mammalian. What’s uniquely human about us are the very complex ways these sensitivities evolve, in the relationships we develop by talking with each other.

To return to the topic of “consciousness” and the “sense of self” — my main point is that there is no absolute dividing-line here. I suggest we think of consciousness as the sum of all the ways we have of noticing things and paying attention to them – not only to things, but especially to each other, and to ourselves. What gives us humans a “sense of self” is that we have more or less highly developed relationships with ourselves, that we conduct through the same means we use to relate to other people... again, mostly through talking. Of course when we talk to ourselves we get to skip a lot of the articulation needed to make ourselves clear to others... but language is still there in the background as what apeiron calls the scaffolding for thought.

“Consciousness” in this sense is not a single capacity we either do or don’t share with other animals. Mice surely see and feel... whatever it is they see and feel. But what gives us humans a highly-developed sense of “living inside our own minds” is that we talk to ourselves a lot... especially today.

If you read Homer, you’ll also find Achilles talking to himself – except that it’s not described that way. He talks to his thymos... which is apparently some bodily organ. Homeric heroes never just sit there and “think” – before philosophy was invented (or its counterparts in some other cultures), human language had no way to recognize such a process.

So again, there are many, many stages we humans have evolved through, in getting to “have a sense of self.” You’re right that the mind/body distinction is ancient – but you’ll find that right through Medieval times the mind is almost never described “from inside” – it’s described as a thing, a kind of object. Augustine’s Confessions come closest to a modern “subjective” view, in his reflections on memory (not present-time consciousness). But even that is almost unique in ancient literature.

The breakthrough with Descartes is not in his “theory” about mind vs. body, but in his taking such a radically subjective view, even questioning whether the objective world is “really there.” It’s only in the latter part of the 17th century that people began to get this point of view and see the world “from inside their own heads”, so to speak. It’s only at that point that “the mind” began to be conceived as “subjective consciousness”.

What’s really interesting about this, to me, is that each one of us personally has evolved through many stages of learning to be self-aware, as we grow up. There is perhaps no ultimate end-point to this evolution... but there is a point at which we typically imagine we’ve come to the end, with “adulthood” (from Latin for “at the end”). At that point we seem to fall under an illusion which can be hard to dispel, that our way of seeing things is just “being conscious” in some neutral, absolute sense. And then we start wondering whether mice or monkeys are also “conscious”, etc.

