Are semantic representations innate?

In summary: "I like" or "I don't like" is basically a boolean state. But from the point of view of physical interactions I think of this in terms of "constructive" and "destructive", referring to the self. I see. So from your perspective, semantics is not just about meaning, but about the relationship between a thing and its "constructive" and "destructive" aspects.
  • #36
zombie? I get the idea but I'm not entirely sure what your point was so forgive me if I am off. but consider this:

atoms aren't conscious. so if I construct a human being out of atoms, is the result a zombie without consciousness?
 
  • #37
CaptainQuasar said:
To put it in your terminology, I wasn't saying that computers and animals do not know the "what", I question whether humans know the "why" any better than a computer or animal could, or if it's just that we usually have more complex "what"'s.

computers know 'how' to do things
animals know 'what' they are doing
humans know 'why' they are doing it.

a young human who doesn't yet know 'why' is in a sense still animal-like. a talking animal thanks to our language module (read 'the language instinct').

if a computer learned to know 'what' it was doing then it would be an animal.
if an animal learned to know 'why' it was doing 'what' it was doing then it would be human. and it would be able to speak.

that is my opinion, FWIW.
 
  • #38
granpa said:
zombie? I get the idea but I'm not entirely sure what your point was so forgive me if I am off. but consider this:

atoms aren't conscious. so if I construct a human being out of atoms, is the result a zombie without consciousness?

Yeah, that's a very good point. It definitely seems like one of the first questions that the discussion of consciousness leads to.

The consciousness stuff wasn't directed at you at all, I was responding to Q_Goest's points that brought it into the discussion of concepts. I don't think it's really relevant, myself: I think an animal or a computer could have a concept without having consciousness, by actively constructing the mental models that I think are related to concepts.

What I was saying in the "what" versus "why" is, if the "what" is a mental model used for predictive purposes, what really distinguishes the "why" from it? It seems to me that the "why" is simply an attempt to make an extended and more complicated mental model. An attempt that appears to sometimes be successful, as in the case of much of science, and sometimes appears to be unsuccessful, as in the cases of when people attribute causes to supernatural forces.

Might cats not have some notion similar to supernatural forces out at the ends of the loose threads of their mental models? It seems to me that they might, and that this might qualify as a "why" in the possession of an animal.
 
  • #39
Rather than go off course here, I’d be interested in your thoughts on post #31 where I try to define meaning. I don’t mind going off on a tangent for a bit to help explain some basic concepts, but would really appreciate if the focus of the thread remained on the idea of meaning and semantic representation.
CaptainQuasar said:
Yes, I know what a p-zombie is, I just wouldn't use obscure, effete terms like that for fear of sounding like a foppish dandy. :biggrin: Not to mention, other participants in the conversation might be unfamiliar with them.
I agree, and I wrestle with the same issue. When does the professional terminology get in the way of understanding an issue? However, as difficult as it is sometimes to understand, the terms are intended to aid in conveying a concept, so I try to define them when I don’t think they will be easily understood.

CaptainQuasar said:
This is exactly what I meant about not being empirical, if you're going to simply assume that you know the presence of consciousness is due to a physical substrate present in humans but not in lower animals.
There’s something called the “supervenience thesis” which I think we should take as an axiom for the purposes of this thread. That thesis simply states there is a physical substrate that supports the phenomenon of consciousness, and any similar physical substrate should therefore also support consciousness. I don’t mind arguing this point, but let’s do that in another thread if you don’t mind.

CaptainQuasar said:
I'm not confusing them, I said I thought a definition of "concept" ought to be something that could be examined empirically. But go ahead, school me in what granpa meant by those terms.
When you say, “.. I wasn't saying that computers and animals do not know the "what", I question whether humans know the "why" any better than a computer or animal could, or if it's just that we usually have more complex "what"'s.”
This might be misconstrued to imply that both computers and animals have some kind of experience when they undergo physical changes in state. Computationalism recognizes there are physical changes of state, and it's the function of those changes (ie: functionalism) which gives rise to the phenomenon.

Computationalism does not say that ALL computers, or any computational physical system for that matter, will possess this phenomenon of consciousness. Note also that computationalism doesn't say that computers are performing mathematical computations. That's not what computers do – they don't do math. They are only interpretable as doing math by a conscious person. Computationalism says that the interactions within some physical system can be simulated or symbolized in some way by using mathematics. There's a big difference here. Again, if there's some misunderstanding about this, perhaps we can start a new thread on what computationalism really means.
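That distinction – physical state transitions on one side, a mathematical reading imposed by an observer on the other – can be sketched in a toy example. The voltage thresholds and the volts-to-bits mapping below are invented for illustration, not drawn from any real circuit:

```python
# A physical system doesn't "do math"; an observer's mapping of its states
# onto symbols is what makes it a computation.

HIGH, LOW = 5.0, 0.0  # invented supply-rail voltages

def gate_physics(v_a, v_b):
    """Physical rule: output swings high only when both inputs exceed 2.5 V."""
    return HIGH if (v_a > 2.5 and v_b > 2.5) else LOW

def as_bit(volts):
    """The interpretation step: an observer's convention reading volts as bits."""
    return 1 if volts > 2.5 else 0

# Under the mapping, the physics "computes" logical AND; without the mapping,
# there are only voltages.
print(as_bit(gate_physics(5.0, 5.0)))  # 1
print(as_bit(gate_physics(5.0, 0.0)))  # 0
```

The point of the sketch is that `gate_physics` describes only interactions; `as_bit` is where the mathematics enters, and it lives entirely in the interpreter.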

What I’d be very interested in is your thoughts on post #31. Any help there would be appreciated.
 
  • #40
we know 'what' a mammal is. it's an animal that has fur, warm blood, live young, legs directly below the body, and many other characteristics. but 'why' is a mammal 'what' a mammal is? why are they so different from reptiles?

this is where the difference is.
 
  • #41
you want to know the meaning of meaning?

what is the meaning of 'mammal'?
 
  • #42
granpa said:
atoms arent conscious.



I wouldn't go all the way to claiming this was true. Your consciousness resides in your brain, which is made up of atoms. We have no way currently to verify your claim. It seems logical from your POV, but there is a chance that it can be wrong. This "emergent property" thing signifies nothing; it's an empty label made to fill up the great void in our understanding of consciousness. IMO there are two options that explain consciousness - you either believe in the thing that's not allowed to be talked about on science forums, or you believe in elementary particles that have a mind of their own and are able to construct a universe and wonderful beings. Pure uncaused randomness leading to energy turning into a universe that has existed for 14 000 000 000 years, governed by a set of laws of physics of very unknown origin, that could harbour conscious life, is utter nonsense IMO.
 
  • #43
granpa said:
we know 'what' a mammal is. it's an animal that has fur, warm blood, live young, legs directly below the body, and many other characteristics. but 'why' is a mammal 'what' a mammal is? why are they so different from reptiles?

this is where the difference is.

Isn't the reason that a mammal is so different from a reptile because "mammal" and "reptile" are categories that were intentionally created by scientists to sort things with different characteristics into? That one, it seems to me, is definitely tied to semantics, and the "why" of the specific words would be tied into the linguistic history of English. But as for a more fundamental "why" if that's what you're asking, human models for why a variety of animals exist have ranged from the action of some creator god to modern science of evolution.

But consider a concept that involves, say, a bear looking at a squirrel and being able to recognize "that thing probably came from the trees", or looking at a bird and thinking "that thing probably came from the sky". Is a human having the same concept, plus an origin story involving either gods or scientifically defined processes, really materially different from what the bear thinks?
 
  • #44
CaptainQuasar said:
Isn't the reason that a mammal is so different from a reptile because "mammal" and "reptile" are categories that were intentionally created by scientists to sort things with different characteristics into?

one does not follow from the other. if categories are random creations of people's minds then one would expect that characteristics would be random. one would expect a continuum of different animals.
 
  • #45
Responding to #31:

Q_Goest said:
This experience of the world then, is more than a mental representation. We might imagine a mental representation of a lion attacking us without any bodily sensation of fear or panic. I don’t think there’s anything inconsistent about that.

In my framing this would be the mental model of a lion being used to either predict an occurrence involving either you yourself, or predict an occurrence not involving yourself. I would call fear or panic additional mental processes that serve to deal with anticipated occurrences involving yourself.

Q_Goest said:
So if we break up the experience we have of the world into a mental representation of the world (which arises through the unification of our senses) and the meaning of this mental representation, then that would seem to point to the unified experience containing meaning only after we associate other bodily sensations to this unified experience.

This doesn't follow from the other things you've said, in my opinion. Mental representations of the world don't simply arise from a unification of our senses. Someone who is blind or deaf, for example, can have a very similar mental representation of the world, granting the same predictive capabilities, compared to someone who has all senses functional.

Lots of things feed into these mental representations that aren't simply sensory information - "communication", which I put in quotes because I'm talking about pieces of mental models that might come from other people, or something mentally symbolized via input from a computer or other inanimate object like a divination or augury - pigeon guts for instance, or even from a communication error - you might develop an idea, an addition to an existing mental representation, because you mis-heard something someone said.

Other things that would be building blocks of mental representations which don't derive from the senses would be things like logic or mathematics. And of course, as you yourself mention, things like panic or fear that appear to be the product of special mental processes or brain structures.
 
  • #46
granpa said:
one does not follow from the other. if categories are random creations of people's minds then one would expect that characteristics would be random. one would expect a continuum of different animals.

If you believe in evolution or at least the fossil evidence it's partially derived from, there is a continuum of different animals, they just aren't all alive today. And there is certainly a continuum of characteristics within a particular species and there are organisms that don't fit into existing scientific categories. In botany, for example, just a few decades ago they had to tear everything apart and re-categorize it based upon genetic information and evolutionary theory.
 
  • #47
instead of asking what is 'meaning' maybe we should ask what is 'information'?
 
  • #48
Yes, good question. Perhaps another is, "is information a form of communication?" I don't know the answer to that.
 
  • #49
CaptainQuasar said:
If you believe in evolution or at least the fossil evidence it's partially derived from, there is a continuum of different animals, they just aren't all alive today. And there is certainly a continuum of characteristics within a particular species and there are organisms that don't fit into existing scientific categories. In botany, for example, just a few decades ago they had to tear everything apart and re-categorize it based upon genetic information and evolutionary theory.

of course there WAS a continuum but there isn't today. why? obviously many died. why? we aren't talking about traits within a species. we are talking about the difference between different species.

as for recategorizing according to genetics, what if you had a car that was a ford and another almost identical one that was a chevy. are they not both cars? the fact that they came from 2 different sources is irrelevant. what about a green car and a blue car. are they not both cars? so classifying animals according to genetics may be useful for biologists but it isn't really a proper classification.
 
  • #50
granpa said:
instead of asking what is 'meaning' maybe we should ask what is 'information'?


The nature of our existence? The ability to find meaning in sequences of elementary particles arranged in certain ways? How else would we make sense of a world ruled by QFT?
Music is a good representation of how we perceive information and reality. We are able to extract patterns of sound waves that make sense to us out of noise (all the possible sound waves).
 
  • #51
information=pattern?
 
  • #52
granpa said:
of course there WAS a continuum but there isn't today. why? obviously many died. why? we aren't talking about traits within a species. we are talking about the difference between different species.

as for recategorizing according to genetics, what if you had a car that was a ford and another almost identical one that was a chevy. are they not both cars? the fact that they came from 2 different sources is irrelevant. what about a green car and a blue car. are they not both cars? so classifying animals according to genetics may be useful for biologists but it isn't really a proper classification.

It seems like you're saying that knowing that something is different is equivalent to knowing why it is different, and that doesn't seem so to me. It just doesn't seem to me that even a complicated explanation involving scientific principles is a different order of thing, mentally, from a bear thinking "that bird probably came from the sky."
 
  • #53
granpa said:
information=pattern?


In music yes. How else would you discern music from noise? Does "rhythm" ring a bell?
 
  • #54
granpa said:
information=pattern?

Oh, if you're talking information theory, yeah. But in that context it has nothing to do with concepts or minds at all, that sort of information exists without any mind to perceive it.
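That mind-independent sense of information can be made concrete: Shannon's measure assigns an entropy to any stream of symbols, whether or not any mind interprets it. A minimal sketch (the example strings are invented for illustration):

```python
import math
from collections import Counter

def shannon_entropy(data):
    """Shannon entropy in bits per symbol: H = -sum(p * log2(p))."""
    counts = Counter(data)
    total = len(data)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

pattern = "ABABABABABABABAB"   # a highly regular "rhythm": two symbols, 50/50
varied  = "AQZPXMRWKTBGEJCN"   # 16 distinct symbols, no repetition

print(shannon_entropy(pattern))  # 1.0 bit/symbol
print(shannon_entropy(varied))   # 4.0 bits/symbol (maximal for 16 symbols)
```

The measure is defined purely over the symbol frequencies, which is exactly why this kind of information "exists without any mind to perceive it" - though whether that is the same thing as *meaning* is the open question in this thread.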
 
  • #55
CaptainQuasar said:
Oh, if you're talking information theory, yeah. But in that context it has nothing to do with concepts or minds at all, that sort of information exists without any mind to perceive it.


Exactly, how would one relay information if it were not through specific patterns?
 
  • #56
information is what is communicated


meaning=information perceived by a conscious mind?
 
  • #57
granpa said:
information is what is communicated


meaning=information perceived by a conscious mind?


Fair enough. If you were at the quantum level, would you be able to make sense of the 100 000 atoms comprising the HIV virus and perceive it as an information carrier? Another reason why I believe a theory of everything will have to account for consciousness.
 
  • #58
granpa said:
information is what is communicated


meaning=information perceived by a conscious mind?

That seems like a good place to start but I should think there would be many examples of information that does not derive from communication.
 
  • #59
CaptainQuasar said:
That seems like a good place to start but I should think there would be many examples of information that does not derive from communication.

it was asked if information=communication. I stated that information IS what is communicated. I didn't imply that only what is communicated is information.
 
  • #60
WaveJumper said:
Fair enough. If you were at te quantum level would you be able to make sense of the 100 000 atoms comprising the HIV virus and perceive it as an information carrier? Another reason why i believe a theory of everything will have to account for consciousness.

not everything is knowable.
 
  • #61
granpa said:
it was asked if information=communication. I stated that information IS what is communicated. I didn't imply that only what is communicated is information.

Oh, gotcha.
 
  • #62
CaptainQuasar said:
This doesn't follow from the other things you've said, in my opinion. Mental representations of the world don't simply arise from a unification of our senses. Someone who is blind or deaf, for example, can have a very similar mental representation of the world, granting the same predictive capabilities, compared to someone who has all senses functional.

Lots of things feed into these mental representations that aren't simply sensory information - "communication", which I put in quotes because I'm talking about pieces of mental models that might come from other people, or something mentally symbolized via input from a computer or other inanimate object like a divination or augury - pigeon guts for instance, or even from a communication error - you might develop an idea, an addition to an existing mental representation, because you mis-heard something someone said.

Other things that would be building blocks of mental representations which don't derive from the senses would be things like logic or mathematics. And of course, as you yourself mention, things like panic or fear that appear to be the product of special mental processes or brain structures.
I don’t disagree. What I’d like to do is come up with an explanation of how meaning can arise in the mind and what are the specific contents of meaning.

I think part of the key is to recognize that the contents of our mind, our sensations, emotions, mental imagery, etc… all the things listed as being “experiences” by Chalmers (see post #17) are everything that our mind has access to, so these experiences are everything that can be used to create ‘meaning’. These experiences we have are the only building blocks that can be used by the mind to create meaning. Further, these building blocks then have to be manipulated in some way by the mind. Perhaps the mind uses an 'equivalence function' which equates a set of experiences to experiences we may have in memory for example.

I think meaning begins with a mental representation of an object, situation, action, etc… What do these mental representations contain? I’d say that they contain the experience of ‘mental imagery’ produced by the senses. These senses undergo processing in the brain so we can have this unified mental image of something.

For example, I seem to remember hearing about some research that showed that even for blind people, the neurons in the brain which were used to create a mental (visual) image were still active and able to create mental imagery or a representation of what would be seen. This ‘structure’ of the world around the blind individual was created through other sensory inputs.

My point here is that things like ‘logic or mathematics’ (as these things are represented in the mind) are based solely on experiences we have, as strange as that may sound.

Another key to understanding meaning is that we learn to associate certain experiences with other experiences as Math points out. I’m thinking that perhaps there are certain sets of experiences that are associated with other sets of experiences (as listed by Chalmers) that result in what we call meaning. But all these sets of experiences have to start with the mental imagery created by our five senses and perhaps the bodily experiences we have as well.

Capt Quasar said, "Lots of things feed into these mental representations that aren't simply sensory information - "communication", which I put in quotes because I'm talking about pieces of mental models that might come from other people.." but again, in order for these communications to form any kind of meaning within our mind, we have to have sensory experiences of the person talking to us, and to get any real meaning out of that experience, we have to relate what that person is talking about to experiences we had ourselves. Otherwise, the communication can't create meaning. So even communications from others only create mental representations or meaning by using experiences (ie: those types of experiences listed by Chalmers) as building blocks.

On the other hand, it may be that our understanding (or the meaning) of things is such a complex mix of these different experiences, that we can't separate out one group of experiences from another. But I don't see anyone trying to tackle this issue from a philosophical perspective. Certainly the linguists and psychologists define meaning in various ways, but those concepts of meaning don't say anything about how they can arise in the mind. They are more like definitions of things such as dictionary.com provides:

1. to have in mind as one's purpose or intention; intend: I meant to compliment you on your work.
2. to intend for a particular purpose, destination, etc.: They were meant for each other.
3. to intend to express or indicate: What do you mean by “liberal”?
4. to have as its sense or signification; signify: The word “freedom” means many things to many people.
5. to bring, cause, or produce as a result: This bonus means that we can take a trip to Florida.
6. to have (certain intentions) toward a person: He didn't mean you any harm.
7. to have the value of; assume the importance of: Money means everything to them. She means the world to him.
 
  • #63
Q_Goest said:
My point here is that things like ‘logic or mathematics’ (as these things are represented in the mind) are based solely on experiences we have, as strange as that may sound.

You are asserting that a mind without any sensory experience could not arrive at logic or mathematics? I do not believe that is so. This is a pretty well-worn argument with me; philosophy guys have tried to convince me of this too.

Claiming that everything in the mind derives exclusively from the senses just seems manifestly untrue to me, no matter how emphatically stated. It's also an assumption that is suspiciously easy to base other Platonic-style reasoning on.

Q_Goest said:
Capt Quasar said, "Lots of things feed into these mental representations that aren't simply sensory information - "communication", which I put in quotes because I'm talking about pieces of mental models that might come from other people.." but again, in order for these communications to form any kind of meaning within our mind, we have to have sensory experiences of the person talking to us, and to get any real meaning out of that experience, we have to relate what that person is talking about to experiences we had ourselves. Otherwise, the communication can't create meaning. So even communications from others only create mental representations or meaning by using experiences (ie: those types of experiences listed by Chalmers) as building blocks.

I again don't think this is true. The medium of communication is irrelevant to the message in practicality - you could experience the same sentence spoken, written in a book, or read via braille and if it's a well-constructed sentence it will convey its message the same way, no integral connection to the sensory experience is involved.

"...to get any real meaning out of that experience, we have to relate what that person is talking about to experiences we had ourselves. Otherwise, the communication can't create meaning." This is assuming your conclusions. We would have to do some really weird and difficult and probably unethical experiments to try to test whether or not this is true, but I simply don't think there's justification to declare that the entirety of all mental representations or meaning is derived exclusively through the senses.

It may well be the bit that isn't derived through the senses that is important somehow. So I think an a priori dismissal of it isn't kosher.
 
  • #64
Here's a good analogy: saying that the mental representations conveyed by communication are ultimately composed of sensory experience is like saying that the basic building block of history is paper because most history is printed in books.
 
  • #65
CaptainQuasar said:
You are asserting that a mind without any sensory experience could not arrive at logic or mathematics? I do not believe that is so.

I don't know... are you suggesting that a brain that doesn't experience sensory qualia could understand that 1+1=2? What then would such a brain use to form the concept of 1? Can you point to a reference that explains and supports your viewpoint?
CaptainQuasar said:
Claiming that everything in the mind derives exclusively from the senses just seems manifestly untrue to me
That’s not what I’m saying. Referring to Chalmers' list which was reprinted in post #17, the other experiences would be #8. ‘other bodily experiences’, #11 ‘emotions’ (although these may be hard to differentiate) and also #12 ‘sense of self’ which covers quite a bit. That covers such things as anger, love, embarrassment, … all the various sensations we experience which can’t be attributed to the sensory experiences. Some of these bodily experiences, such as desire for sex, are at least partially a function of age. But without any sensory experiences (ie: 1 through 7) or other bodily experiences, there’s not much left. It’s this sum total of all experiences that constitutes ‘everything in the mind’ – not just sensory experience. So take a look at that list, and see if you can think of any other experiences a mind may possess.

CaptainQuasar said:
The medium of communication is irrelevant to the message in practicality - you could experience the same sentence spoken, written in a book, or read via braille…
There’s no disagreement here. When I said ‘talking’ I meant it in the colloquial way (ie: communicating).

CaptainQuasar said:
I simply don't think there's justification to declare that the entirety of all mental representations or meaning is derived exclusively through the senses.
I don’t claim meaning is derived exclusively through the senses. Examine post #17 where I’ve quoted Chalmers for the purposes of this discussion. That list of experiences is not intended to be all encompassing as I’ve noted there. It is a list however, intended to aid in defining what is meant by ‘experiences’. In addition, I mean that list to differentiate between qualia on the one side and things that may be a summation of other experiences. By “summation of other experiences” I mean the unified experience we have of those various qualia is what acts as the basis for meaning.

I think it would be very beneficial to read through Stevan Harnad’s paper, “The Symbol Grounding Problem” which in a sense, forms a basis for the views I'm proposing here. Here's his abstract:
ABSTRACT: There has been much discussion recently about the scope and limits of purely symbolic models of the mind and about the proper role of connectionism in cognitive modeling. This paper describes the "symbol grounding problem": How can the semantic interpretation of a formal symbol system be made intrinsic to the system, rather than just parasitic on the meanings in our heads? How can the meanings of the meaningless symbol tokens, manipulated solely on the basis of their (arbitrary) shapes, be grounded in anything but other meaningless symbols? The problem is analogous to trying to learn Chinese from a Chinese/Chinese dictionary alone. A candidate solution is sketched: Symbolic representations must be grounded bottom-up in nonsymbolic representations of two kinds: (1) "iconic representations", which are analogs of the proximal sensory projections of distal objects and events, and (2) "categorical representations", which are learned and innate feature-detectors that pick out the invariant features of object and event categories from their sensory projections. Elementary symbols are the names of these object and event categories, assigned on the basis of their (nonsymbolic) categorical representations. Higher-order (3) "symbolic representations", grounded in these elementary symbols, consist of symbol strings describing category membership relations (e.g., "An X is a Y that is Z"). Connectionism is one natural candidate for the mechanism that learns the invariant features underlying categorical representations, thereby connecting names to the proximal projections of the distal objects they stand for. In this way connectionism can be seen as a complementary component in a hybrid nonsymbolic/symbolic model of the mind, rather than a rival to purely symbolic modeling.
Such a hybrid model would not have an autonomous symbolic "module," however; the symbolic functions would emerge as an intrinsically "dedicated" symbol system as a consequence of the bottom-up grounding of categories' names in their sensory representations. Symbol manipulation would be governed not just by the arbitrary shapes of the symbol tokens, but by the nonarbitrary shapes of the icons and category invariants in which they are grounded.
Ref: http://users.ecs.soton.ac.uk/harnad/Papers/Harnad/harnad90.sgproblem.html
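As a rough illustration of the abstract's proposal (a toy sketch only, not Harnad's actual model): elementary symbols can be grounded by learning categorical representations - here, per-category centroids over invented "sensory" feature vectors - so that a detector assigns category names nonarbitrarily, and higher-order symbols are then composed from grounded ones. All feature names and example values below are made up:

```python
import math

# "Iconic representations": feature vectors standing in for proximal sensory
# projections (invented features: [has_fur, lays_eggs, flies])
icons = {
    "dog_1":  [1.0, 0.0, 0.0],
    "dog_2":  [0.9, 0.0, 0.0],
    "bird_1": [0.0, 1.0, 1.0],
    "bird_2": [0.1, 0.9, 1.0],
}
labels = {"dog_1": "dog", "dog_2": "dog", "bird_1": "bird", "bird_2": "bird"}

# "Categorical representations": a learned feature detector, modeled here as
# the centroid of each category's iconic representations
def learn_centroids(icons, labels):
    sums, counts = {}, {}
    for name, vec in icons.items():
        cat = labels[name]
        sums.setdefault(cat, [0.0] * len(vec))
        sums[cat] = [s + v for s, v in zip(sums[cat], vec)]
        counts[cat] = counts.get(cat, 0) + 1
    return {cat: [s / counts[cat] for s in vec] for cat, vec in sums.items()}

def categorize(vec, centroids):
    """Assign the elementary symbol: the name of the nearest category."""
    return min(centroids, key=lambda c: math.dist(vec, centroids[c]))

centroids = learn_centroids(icons, labels)
print(categorize([0.95, 0.0, 0.1], centroids))  # dog

# Higher-order "symbolic representation", composed only of grounded symbols
# (the "An X is a Y that is Z" pattern; 'puppy' is an invented example)
definitions = {"puppy": ("dog", "young")}
```

In Harnad's proposal a connectionist network, not a centroid, learns the invariant features; the centroid just keeps the sketch self-contained.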
 
  • #66
Q_Goest said:
I don’t know.. are you suggesting that a brain that doesn’t experience sensory qualia could understand that 1+1=2? What then would such a brain use to form the concept of 1? Can you point to a reference that explains and supports your viewpoint?

No. So what? Pointing to someone riffing about their thoughts on this stuff who claims "a mind without sensory qualia cannot understand math" or "a mind without sensory qualia can definitely understand math" without any experimental evidence to back it would be a crappy argument from authority.

You want a way to get to 1+1=2? Conscious and unconscious. Count 'em. Two parts of a mind that might be perceived and counted. And there are probably more. And I mentioned logic too, of course - there is no a priori reason why a mind without sensory experience could not perceive true and false or things like the Principle of Noncontradiction or the Buddhist Tetralemma. Any assertion that this is not possible is pulling a rabbit out of a hat to support your position.

Q_Goest said:
That’s not what I’m saying...

It does appear to be exactly what you're saying, you're just calling it "qualia" or "experiences". Right below here you say, "It’s this sum total of all experiences that constitutes ‘everything in the mind’". That is what I am disagreeing with.

Q_Goest said:
Referring to Chalmers' list which was reprinted in post #17, the other experiences would be #8. ‘other bodily experiences’, #11 ‘emotions’ (although these may be hard to differentiate) and also #12 ‘sense of self’ which covers quite a bit. That covers such things as anger, love, embarrassment, … all the various sensations we experience which can’t be attributed to the sensory experiences. Some of these bodily experiences, such as desire for sex, are at least partially a function of age. But without any sensory experiences (ie: 1 through 7) or other bodily experiences, there’s not much left. It’s this sum total of all experiences that constitutes ‘everything in the mind’ – not just sensory experience. So take a look at that list, and see if you can think of any other experiences a mind may possess.

Why? My position is that mental representations of things are not made exclusively from experiences or qualia or whatever you want to call them; it doesn't matter whether or not we can come up with more than what's on that list.

Q_Goest said:
I think it would be very beneficial to read through Stevan Harnad’s paper, “The Symbol Grounding Problem” which in a sense, forms a basis for the views I'm proposing here. Here's his abstract:

It's certainly a nifty paper but it does not seem to address what we're discussing at the moment. The abstract does not appear to even contain the word "qualia".

Go ahead and say something like most of the things in the mind primarily derive from qualia or experience or whatever. I just don't agree that everything does.

And by the way, this Socratic Method thing where you keep trying to send me off on little tasks does not impress me much.
 
