What or where is our real sense of self?

by Metarepresent
Tags: consciousness, qualia, recursion, representation
fuzzyfelt
#55
Feb15-11, 06:58 AM
PF Gold
fuzzyfelt's Avatar
P: 742
Welcome Metarepresent!

Quote Quote by Q_Goest View Post
Ramachandran takes the fairly uncontroversial stance that the overall phenomenal experience (and the many parts of that experience) is what equates to the mental representation.

All this isn’t to say that our present approach to consciousness is without difficulties as Ramachandran points out in the video. Some very serious issues that pop out of this approach include “mental causation” and the problems with downward causation (ie: strong downward causation) and the knowledge paradox. So once we understand the basic concept of mental representation as presented for example by Ramachandran, we then need to move on to what issues this particular model of the human mind gives rise to and look for ways to resolve those issues.


I'd like to hear more about the problems and how they might be resolved. I wonder if this idea regarding processing could be related to the problems and solutions.


"...cognitive neuroscientists have increasingly come to view
grapheme recognition as a process of hierarchical feature analysis (see
Grainger et al., 2008 and Dehaene et al., 2005 for reviews). As in the
original Pandemonium model (Selfridge, 1959), hierarchical feature
models posit a series of increasingly complex visual feature
representations and describe grapheme recognition as resulting
from the propagation of activation through this hierarchical network.
In the initial stages of letter processing, visual input activates
component features of the letter (line segments, curves, etc.) and
results in the partial activation of letters containing some or all of the
component features. Grapheme identification occurs over time via a
competitive activation process involving some combination of
excitatory and inhibitory connections both within the grapheme
level and between the grapheme level and other representational
levels, both bottom–up and top–down.
This Pandemonium model of letter perception is supported by a
wealth of studies on letter recognition, indicating that the number of
component features shared by a pair of letters predicts the likelihood
of those letters being confused (Geyer and DeWald, 1973). Integrating
these behavioral measures with the neuro-anatomical models of
visual perception, careful examination of the brain response to
pseudo-letters (non-letter shapes visually matched to the component
features comprising real letters) as well as infrequent and frequent
letters shows a cascading hierarchy of processing within the PTGA,
proceeding from posterior to anterior regions (Vinckier et al., 2007).
Further, ERP studies of letter processing (e.g., comparing the brain
response to letters and pseudo-letters) suggest feature-level processing
occurs before 145 ms, and letter-level processes occur thereafter
(Rey et al., 2009)."

http://www.sciencedirect.com/science...c&searchtype=a
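As a rough illustration of that hierarchical feature-to-letter story, here is a toy sketch in Python. The feature sets and the competition rule are invented for illustration, not taken from the cited papers; the point is just that a letter sharing component features with the input stays partially active (which is why feature overlap predicts confusability) until the competitive stage settles on a winner.

```python
# Toy Pandemonium-style sketch (illustrative only; the feature sets and
# the competition rule are invented, not taken from the cited papers).

# Hypothetical component features for a few letters.
LETTER_FEATURES = {
    "E": {"vertical", "top_bar", "mid_bar", "bottom_bar"},
    "F": {"vertical", "top_bar", "mid_bar"},
    "L": {"vertical", "bottom_bar"},
    "O": {"closed_curve"},
}

def bottom_up_evidence(input_features):
    """Feature level -> letter level: each letter is excited in
    proportion to how many of its component features are present."""
    return {
        letter: len(feats & input_features) / len(feats)
        for letter, feats in LETTER_FEATURES.items()
    }

def compete(evidence, steps=10):
    """Crude competitive stage standing in for the excitatory/inhibitory
    interactions at the letter level: repeatedly sharpen the activations
    (divisive normalisation) until one letter dominates."""
    act = {letter: max(a, 1e-6) for letter, a in evidence.items()}
    for _ in range(steps):
        total = sum(a * a for a in act.values())
        act = {letter: a * a / total for letter, a in act.items()}
    return act

evidence = bottom_up_evidence({"vertical", "top_bar", "mid_bar"})
print(evidence)           # 'E' is partially active too: it shares 3 of its 4 features
print(compete(evidence))  # after competition, 'F' dominates
```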

"Hierarchical organization
The ventral visual pathway is organized as a hierarchy of areas connected
by both feedforward and feedback pathways (see Figure I). From
posterior occipital to more anterior inferotemporal regions, the size of
the neurons’ receptive fields increases by a factor of 2–3. This is
accompanied by a systematic increase in the complexity of the neurons’
preferred features, from line segments to whole objects, and a
corresponding increase in invariance for illumination, size, or location."

http://www.ncbi.nlm.nih.gov/pubmed/15951224
apeiron
#56
Feb15-11, 01:28 PM
PF Gold
apeiron's Avatar
P: 2,432
Quote Quote by ConradDJ View Post
When I said projective interpreting was a primary human capacity, I didn’t mean to imply that it’s uniquely human – though of course in some ways that’s true.
Another way of putting it is that social animals project as best they can; we just have much more going on internally to project. We have a running, socially formed sense of self as something extra to project on to others (and, in animistic fashion, on to chimps, mice, trees and forces of nature).

The mice experiment (how do these things get through the ethics committees?) at most shows mice being sensitised to their own pain, not mice sympathising with the pain of others. As I say, you do have to wonder about the empathetic abilities of experimenters who can inject mice with cramp-inducers in the first place.

Quote Quote by ConradDJ View Post
To return to the topic of “consciousness” and the “sense of self” — my main point is that there is no absolute dividing-line here.
The animal sense of self would be the completely subjective embodied form. The point about humans is that we carry around in our heads a second "objective" view of ourselves - the view that society would have of our actions and our existence. Our every embodied response or impulse is being run through that secondary layer of ideas that is socially evolved. We are made conscious of being an agent within a collective structure. That is the big irony. The more aware of ourselves we are, the more careful we can be to play a creative part in that larger state of being.

And this irony is also why Vygotsky (a Marxist!) has been so strongly resisted within Anglo-Saxon culture. The Anglo-Saxon mythology about free individuals (enchained by the arbitrary conventions of their societies) virtually demands that the essence of a person is biological. If they cannot be free in the way of a soul, then genetic determinism is preferred to social determinism.

Quote Quote by ConradDJ View Post
If you read Homer, you’ll also find Achilles talking to himself – except that it’s not described that way. He talks to his thymos... which is apparently some bodily organ. Homeric heroes never just sit there and “think” – before philosophy was invented (or its counterparts in some other cultures), human language had no way to recognize such a process.

You have read Jaynes' curious success, the Bicameral Mind, then.

A very good book tracing the development of the language of mind, and so concepts about what the mind is, is Kurt Danziger's Naming the Mind.
http://www.kurtdanziger.com/Book%202.htm

Quote Quote by ConradDJ View Post
What’s really interesting about this, to me, is that each one of us personally has evolved through many stages of learning to be self-aware, as we grow up.
What really shocked me when I first started out was how poor my introspective abilities actually were. It took years of practice (and good theories about what to expect to find) to really feel I was catching the detail of what actually took place.

It is the same as having a trained eye in any field of course. Someone who has never played football will just have a rushing impression of a bustle on the pitch. A good player will see an intricately organised tapestry.

Introspection is a learnt skill. And society only wants to teach us to the level which suits its purposes. Another irony of the story. To go further, you have to get actually "selfish" and study psychophysics, Vygotskian psychology, social history, etc.
apeiron
#57
Feb15-11, 01:39 PM
PF Gold
apeiron's Avatar
P: 2,432
Quote Quote by fuzzyfelt View Post
I'd like to hear more about the problems and how they might be resolved. I wonder if this idea regarding processing could be related to the problems and solutions.
There is no way of doing neuroscience without adopting a hierarchical view of processing, or a belief in top-down action. Every cortical neuron has far more top-down feedback synapses modulating its actions than bottom-up input synapses.

The debate really is over how to model this complexity. Some go towards computational models (such as Grossberg's ART). Others go towards dynamicist models - of the kind that Pythagorean cited.
Pythagorean
#58
Feb15-11, 10:20 PM
PF Gold
Pythagorean's Avatar
P: 4,293
Quote Quote by apeiron View Post
There is no way of doing neuroscience without adopting a hierarchical view of processing, or a belief in top-down action. Every cortical neuron has far more top-down feedback synapses modulating its actions than bottom-up input synapses.

The debate really is over how to model this complexity. Some go towards computational models (such as Grossberg's ART). Others go towards dynamicist models - of the kind that Pythagorean cited.
They're not fundamentally mutually exclusive. I can quite easily (in principle) model an ART network with dynamical biophysical neurons. The only real difference is I'd be replacing postdictive states (1-0) with a predictive dynamical system that describes the biophysical mechanisms. So now we have a spectrum of states from 0 to 1 that depend on stimuli (and the history of stimuli) to the system. And of course the complexity grows with the number of neurons, and you eventually require supercomputers (or lots of patience).
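A toy sketch of what I mean (Python, not any particular published ART or biophysical model; the time constant, gain and threshold are arbitrary): the binary unit's output depends only on the current stimulus, while the leaky-integrator unit's continuous state carries the history of its stimuli.

```python
# Toy contrast between a binary ("postdictive") unit and a continuous
# dynamical one.  Parameters are arbitrary, not fitted to biophysics.
import numpy as np

def binary_unit(stimulus, threshold=0.5):
    """1/0 category node: the output depends only on the current input."""
    return 1.0 if stimulus > threshold else 0.0

def dynamical_unit(stimuli, dt=0.1, tau=1.0, gain=4.0):
    """Leaky integrator, dv/dt = (-v + s) / tau, with a sigmoidal readout.
    The state v accumulates the history of the stimuli, so the output is
    a graded value between 0 and 1 rather than a switch."""
    v, outputs = 0.0, []
    for s in stimuli:
        v += dt * (-v + s) / tau                      # Euler step
        outputs.append(1.0 / (1.0 + np.exp(-gain * (v - 0.5))))
    return outputs

stimuli = [0.2, 0.2, 0.8, 0.8, 0.8, 0.0, 0.0]
print([binary_unit(s) for s in stimuli])              # jumps: 0/1 only
print(np.round(dynamical_unit(stimuli), 2).tolist())  # graded and history-dependent
```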
apeiron
#59
Feb15-11, 11:54 PM
PF Gold
apeiron's Avatar
P: 2,432
Quote Quote by Pythagorean View Post
They're not fundamentally mutually exclusive.
No, of course not. In fact that is what led me to a causal perspective that sees them as complementary extremes.

It seems obvious that both are involved, but then the next thing is how do you model that in practice. The Laurent work you cited argues that dynamical networks can cycle linearly through a "channel" of metastable states. So you get a computational looking outcome - a liquid state machine.
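As a toy illustration of that kind of cycling (my own sketch of a standard Lotka-Volterra "winnerless competition" network with asymmetric inhibition, of the sort used in that literature, not the cited model itself), three mutually inhibiting units take turns at being the dominant, metastable state:

```python
# Toy winnerless-competition network (May-Leonard / Lotka-Volterra with
# asymmetric inhibition): the trajectory moves through a channel of
# metastable states, each dominated by a different unit.
# Parameters are illustrative only.
import numpy as np

alpha, beta = 0.8, 1.3     # asymmetric inhibition (alpha < 1 < beta, alpha + beta > 2)
dt, steps = 0.01, 40000
drive = 1e-3               # weak constant input keeps the cycling going

x = np.array([0.6, 0.3, 0.1])
dominant = [int(np.argmax(x))]
for _ in range(steps):
    x1, x2, x3 = x
    dx = np.array([
        x1 * (1 - x1 - alpha * x2 - beta * x3),
        x2 * (1 - x2 - alpha * x3 - beta * x1),
        x3 * (1 - x3 - alpha * x1 - beta * x2),
    ]) + drive
    x = np.maximum(x + dt * dx, 0.0)
    if int(np.argmax(x)) != dominant[-1]:
        dominant.append(int(np.argmax(x)))

print(dominant)  # the dominant unit keeps switching, cycling through all three
```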

Personally I don't think that the approach is significant. It may indeed turn out to be a level of mechanism that brain circuitry exploits. However I am talking about a more general accommodation between dynamism and computationalism. A model at a more abstract level.

This is why for example I emphasise Pattee's work on the epistemic cut, or Peircean semiotics. But I also agree that it is dynamism/computationalism across all levels. Or analog~digital, continuous~discrete, rate-dependent~rate-independent, or however one chooses to describe it.
Pythagorean
#60
Feb16-11, 12:49 AM
PF Gold
Pythagorean's Avatar
P: 4,293
Quote Quote by apeiron View Post
Personally I don't think that the approach is significant.
It may not be in the general context of qualia, but I think the experimental observations directly support a statement like:

Quote Quote by apeiron
Qualia are like the whorls of turbulence that appear in a stream. Each whorl seems impressively like a distinct structure. But try to scoop the whorl out in a bucket and you soon find just how contextual that localised order was.
It demonstrates how the appearance of transient events ("whorls of turbulence") in the "stream" of information can be correlated with perception events in the behavior of an organism (i.e. qualia presumably).
ConradDJ
#61
Feb16-11, 05:39 AM
PF Gold
P: 302
Quote Quote by apeiron View Post
You have read Jaynes' curious success, the Bicameral Mind, then.

Yes, along with Onians' Origins of European Thought, Snell's Discovery of the Mind, etc. -- relics of my academic years (which are by now also ancient history). But thank you for the reference to Danziger, I'll put it on my want-list.

I still like Jaynes' book, which has an interesting take on what we mean by "consciousness". He points out that we can do most everything we do without being conscious of it at all... like driving all the way through town to work, while thinking about something else, oblivious to what we're passing on the road, yet all the while maneuvering around potholes, etc. We only really need to be conscious when we're dealing with new and challenging situations, he suggests. And he goes on to ask how people might have dealt with such situations before there was the kind of developed “sense of self” that came with philosophy and the emergence of “self-reflective” internal dialogue.

This was a fine effort to imagine a really different kind of consciousness. The odd thing is that he’s dealing with the period of transition from oral to written culture, but missed the significance of that change. It's so difficult not to take for granted the basic tools we ourselves use in being conscious, like the ability to record spoken language.

Quote Quote by apeiron View Post
What really shocked me when I first started out was how poor my introspective abilities actually were. It took years of practice (and good theories about what to expect to find) to really feel I was catching the detail of what actually took place.

Yes, you’re right – and of course, all the deep layers of consciousness that we evolved when we were young have long been covered up by the more sophisticated and more verbal layers we’ve built on top of them. We have no memory at all of our earliest years, since the neural structures to support conscious recollection were still undeveloped.
ConradDJ
#62
Feb19-11, 05:54 AM
PF Gold
P: 302
Quote Quote by apeiron View Post
The animal sense of self would be the completely subjective embodied form. The point about humans is that we carry around in our heads a second "objective" view of ourselves - the view that society would have of our actions and our existence. Our every embodied response or impulse is being run through that secondary layer of ideas that is socially evolved.
Yes, I think this gets at what's essentially distinctive about human consciousness -- that we're operating mentally on two planes. The one plane is what we share with other animals, a highly evolved interactive connection to the world in present time. The other is a highly evolved projection of objective reality that we learn to construct and maintain in our heads as we learn to talk -- a projection that goes far beyond the "here and now", to include things that happened hundreds of years ago and things we imagine may happen far in the future, and things that may be going on now in distant countries, etc.

Human "language" is not merely a matter of words and grammar. In essence it's a software technology that has two primary functions -- (1) creating and maintaining this projection of the world from a standpoint "outside of time", as a perspective from which we experience and interpret our real-time interaction. And (2) communicating itself and its projections from one human brain to another, as its means of reproducing itself.

If we try to understand the differences between humans and other animals in terms of what we and they can or can't do, we can find primitive versions of most things human. Because we don't have clear definitions of "consciousness", "self", "introspection", "thought", "memory" or "language", etc. we can find ways to use all these terms for what animals do, if we want to.

But it seems to me the key is this software-layer of consciousness that has evolved by reproducing itself from brain to brain, and has evolved more and more complex ways of ensuring its own reproduction. You and I, as "conscious minds", are essentially run-time constructs of this software, running on the neural hardware of our brains.

So what's different between us and other animals isn't so much what we do as how we do it. What's unique about us is this software that gets installed in each of us in the first few years of our lives. At some point early on in human evolution, this symbiotic relationship became so vital to our biological survival that our brains and bodies began adapting very rapidly to support it -- including not only changes in the brain, but a great lengthening of the period in which human children are helplessly dependent on adults and deeply attached to them at an emotional level.
apeiron
#63
Feb19-11, 01:58 PM
PF Gold
apeiron's Avatar
P: 2,432
Quote Quote by ConradDJ View Post
Yes, I think this gets at what's essentially distinctive about human consciousness -- that we're operating mentally on two planes.
So when it comes to the OP, there are three levels of selfhood.

1) animal level is BEING a self.
2) human level is KNOWING you are BEING a self.
3) Vygotskian level is KNOWING that you KNOW you are BEING a self.

It is meta-metarepresentation.
fuzzyfelt
#64
Feb20-11, 04:42 AM
PF Gold
fuzzyfelt's Avatar
P: 742
Quote Quote by apeiron View Post
So when it comes to the OP, there are three levels of selfhood.

1) animal level is BEING a self.
2) human level is KNOWING you are BEING a self.
3) Vygotskian level is KNOWING that you KNOW you are BEING a self.

It is meta-metarepresentation.
So, to sum, I take it #2 is representing representations or manipulating representations of representations.

This is possible among many animals. It is possible among animals which may not communicate so specifically, e.g. a lion can weigh a visual representation of a number of defenders against an aural representation of a number of opponents, and will determine whether to act or not to act (Dehaene). Such abstractions are also performed by pre-verbal 4-6 month human infants (Dehaene, again). And the David Attenborough BBC documentary linked elsewhere shows monkeys that produce and hear specific calls representing specific threats, calls which are then confirmed by visual representations. Further, these representations may be manipulated for deception.

So, what is the actual distinction between # 2 and #3 in which language provides a uniquely human consciousness?
apeiron
#65
Feb20-11, 04:54 AM
PF Gold
apeiron's Avatar
P: 2,432
Quote Quote by fuzzyfelt View Post
So, what is the actual distinction between # 2 and #3 in which language provides a uniquely human consciousness?
That was a joke. Sorry if it was not obvious.

Though 3 would indeed be an example of socially constructed knowledge that takes individual self-awareness to a higher level.

BTW I would not use the term representation as it is a strictly computational concept - information as input that gets displayed. The brain does not work like that. (Cue the usual chorus....)
fuzzyfelt
#66
Feb20-11, 05:18 AM
PF Gold
fuzzyfelt's Avatar
P: 742
Yes, I was enjoying indulging in manipulations of metas, too. However, I don't see a reason to take this last point as necessarily a causal connection, if there is anything to these last ideas at all.
tessxyz
#67
Feb25-11, 12:27 PM
P: 1
Long but highly recommended.

http://www.youtube.com/watch?v=o4Z8CqAiYI8

http://www.youtube.com/watch?v=fbItR...eature=related
Tregg Smith
#68
Mar17-11, 10:09 PM
P: 48
Would a person with absolutely no memory have self?
Pythagorean
#69
Mar18-11, 12:28 AM
PF Gold
Pythagorean's Avatar
P: 4,293
Absolutely none? Of course not.

An amnesiac? Yes, but the self wouldn't be allowed to evolve due to hippocampal damage. The hippocampus "writes" to your neocortex (especially while you sleep).
apeiron
#70
Mar18-11, 06:58 PM
PF Gold
apeiron's Avatar
P: 2,432
Quote Quote by Tregg Smith View Post
Would a person with absolutely no memory have self?
You can't really have a brain and not have "memory" - a neural organisation with a specific developmental history.

But there is evidence for what happens when, for example, the hippocampus is destroyed and people can't form fresh autobiographical memories - that is, store new experiences.

See the celebrated case of Clive Wearing.
http://en.wikipedia.org/wiki/Clive_Wearing
Lievo
#71
Mar18-11, 07:09 PM
P: 268
Quote Quote by Tregg Smith View Post
Would a person with absolutely no memory have self?
You may wish to read this. Likely the best pop book on neuropsychology ever. It mentions, among other things, the case of anterograde amnesia that Pythagorean and apeiron mentioned, and discusses your question specifically (arguing for a yes).
Pythagorean
#72
Mar18-11, 07:11 PM
PF Gold
Pythagorean's Avatar
P: 4,293
And of course, amnesiacs can still learn implicit memories, since their basal ganglia are still intact, though implicit memories are not conscious.

Parkinson's and Huntington's patients, on the other hand, have problems associated with the basal ganglia.

The two different kinds of damage are associated with different kinds of learning deficiencies, though amnesiacs are clearly the case involving the more familiar autobiographical memory.

