Dec28-12, 02:07 PM
Pythagorean
Firstly, recognize that consciousness is a big subject, and different people often mean different things when they use the word. You seem to be referring to the subjective-experience aspect of consciousness (that we have an experience associated with our behavior and actions).

It's unclear what neural structures may be required for subjective experience, or whether other structures can do the job too. For instance, plants have "brain-like" information-processing organs[1], and single-celled organisms have molecular networks that can be viewed as analogous to associative memory[2]. All this evidence shows is that limited intelligence can be demonstrated in other organisms. Depending on whether or not you associate the ability to have a subjective experience with intelligence, this kind of evidence may or may not be meaningful.
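For what it's worth, "associative memory" here is the same abstraction computer scientists use: a network that, given a corrupted cue, settles back into a stored pattern. The cited molecular networks are only claimed to be *analogous* to this; a toy Hopfield-style sketch (textbook model, not anything from the papers) shows the idea:

```python
import numpy as np

# Toy Hopfield-style associative memory: store one binary (+/-1)
# pattern via a Hebbian rule, then recover it from a corrupted cue.
# This is the standard textbook abstraction, not the biology itself.

def train(patterns):
    """Build a Hebbian weight matrix from rows of +/-1 patterns."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0)  # no self-connections
    return W

def recall(W, cue, steps=10):
    """Synchronously update states until a fixed point (or step limit)."""
    s = cue.copy()
    for _ in range(steps):
        new = np.sign(W @ s)
        new[new == 0] = 1
        if np.array_equal(new, s):
            break
        s = new
    return s

stored = np.array([[1, -1, 1, -1, 1, -1, 1, -1]])
W = train(stored)
noisy = stored[0].copy()
noisy[0] = -noisy[0]        # corrupt one bit of the cue
print(recall(W, noisy))     # settles back to the stored pattern
```

The point of the analogy is just that "memory" can live in any dynamical system with stable attractors, whether the units are neurons, molecules, or numbers in an array.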

The problem is that, even with humans, we can only infer other humans have consciousness from similar behavior; there's no rigorous evidence-based test for consciousness. We can test intelligence and look at information processing in organisms, but we'll probably never know whether (or how) other organisms experience things.

Excepting, of course, a solution to the so-called "hard problem of consciousness"[3]. I personally find that, at its core, this is no different than, say, a "hard problem of entropy": we don't know what entropy is or why it should be; it just is. I don't expect "problems" like these to ever be solved.

However, every day we solve pieces of the easy problem: the functional story of how neurons behave. And to some extent, we can associate our personal experiences with those functions (something called the "neural correlates of consciousness"), but there's no obvious way to carry that analogy over to something like a plant, where no homologous brain structures occur.