Are there things language cannot describe?


by Nile3
Tags: language, things
Nile3
#1
Jan19-12, 11:42 AM
P: 42
We know that certain chains of elements can describe very complex logical patterns, and that those same elements can describe illogical things to a given interpreter. But the question at hand is: are there logical elements such that no logical language can describe them? How would we prove or disprove this?
SteveL27
#2
Jan19-12, 12:45 PM
P: 799
Quote by Nile3:
We know that certain chains of elements can describe very complex logical patterns, and that those same elements can describe illogical things to a given interpreter. But the question at hand is: are there logical elements such that no logical language can describe them? How would we prove or disprove this?
Not sure what you mean by a "logical element." But it's mathematically provable that there are real numbers (or alternatively, binary sequences) that cannot be described by any language consisting of finite-length strings over a countable alphabet.

There are uncountably many binary sequences (infinite strings of 1's and 0's) but only countably many finite-length strings over a countable alphabet.

So there are real numbers that cannot possibly be named by language, and that cannot be approximated to arbitrary precision by any algorithm or Turing machine.

See http://en.wikipedia.org/wiki/Computable_number

So in fact "almost all" numbers are not computable or nameable, in the sense that the computable numbers are a relatively tiny countable set among a vast uncountable sea of all real numbers.

[Real numbers and binary sequences are essentially the same thing, which is why I freely went back and forth between them.]
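If it helps, here's a rough sketch of the countability half of the argument in Python (a toy illustration of my own; I've used a small finite alphabet for simplicity, and a countably infinite alphabet needs an extra dovetailing step, but the conclusion is the same). Every finite-length string shows up at some finite position in this listing, which is exactly what "countably many" means; Cantor's diagonal argument shows that no such listing can exist for the infinite binary sequences.

from itertools import count, product

def all_finite_strings(alphabet):
    # List every finite-length string over the alphabet: shorter strings
    # first, then in dictionary-style order within each length.  Each
    # string appears at some finite position, so the set is countable.
    for n in count(1):
        for chars in product(alphabet, repeat=n):
            yield "".join(chars)

# Toy example over a 3-symbol alphabet (any alphabet works the same way)
gen = all_finite_strings("abc")
print([next(gen) for _ in range(10)])
# ['a', 'b', 'c', 'aa', 'ab', 'ac', 'ba', 'bb', 'bc', 'ca']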

To clarify with a couple of examples:

* sqrt(2) is an irrational number. But you can see that I've uniquely characterized it with only 7 symbols. And there are well-known algorithms to calculate sqrt(2) to any desired degree of precision. So sqrt(2) is computable.

* Pi is not only irrational, it's also transcendental. (Transcendental means it is not the root of any nonzero polynomial with rational coefficients.) But pi can be uniquely characterized by many well-known algorithms and formulas that can be described with finitely many symbols. So pi is computable.
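Just to make "computable" concrete, here is a quick sketch in Python (my own illustration; the particular formulas are just convenient choices, and Newton's method or other series would do equally well). It prints as many correct digits of sqrt(2) and pi as you ask for, using nothing but integer arithmetic:

from math import isqrt

def sqrt2_digits(n):
    # floor(sqrt(2) * 10**n), using only integer arithmetic
    return isqrt(2 * 10 ** (2 * n))

def scaled_arctan_recip(x, scale):
    # scale * arctan(1/x), to within a few units in the last place,
    # via the series arctan(1/x) = 1/x - 1/(3*x^3) + 1/(5*x^5) - ...
    total, k = 0, 1
    term = scale // x
    while term:
        total += term if k % 4 == 1 else -term
        k += 2
        term = scale // (k * x ** k)
    return total

def pi_digits(n):
    # Machin's formula: pi = 16*arctan(1/5) - 4*arctan(1/239)
    scale = 10 ** (n + 10)                    # 10 guard digits
    p = 16 * scaled_arctan_recip(5, scale) - 4 * scaled_arctan_recip(239, scale)
    return p // 10 ** 10                      # drop the guard digits

print(sqrt2_digits(30))  # 1414213562373095048801688724209
print(pi_digits(30))     # 3141592653589793238462643383279

The point is only that a finite program pins each of these numbers down to arbitrary precision; that is exactly what fails for the uncomputable reals.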

Hope this sheds some light on your question. There are lots of things, even in math, that we regard as having mathematical existence, but that we cannot possibly name or uniquely characterize.

I would add that this question is much more interesting if we apply it to the world in general, and not just math. Are there things in the world that we can't name? Or do things in the world only exist by virtue of being nameable? This is philosophy, but it's interesting.
chiro
#3
Jan19-12, 10:04 PM
P: 4,570
SteveL27, I'd be very cautious about claiming that an arbitrary number can't be computed.

The methods of computation we currently have are extremely primitive. Add to this that our ability to categorize things like numbers or functions in any kind of analytic form is also primitive. In terms of analytic primitives we don't have many, and I think it's a little short-sighted to say that such things can't be developed in the future.

To add to that, think of how limited we are as human beings at finding patterns in anything. The truth is that we are hopeless when it comes to randomness, and this shows in human behaviour when people are asked to do random things.

Our algorithms are really, really primitive, and that, combined with the fact that we are hopeless at making sense of anything 'random', should be an indicator not to make overly inductive statements like this.

pwsnafu
#4
Jan19-12, 11:10 PM
Sci Advisor
P: 773

Quote by chiro:
SteveL27, I'd be very cautious about claiming that an arbitrary number can't be computed.

<snip>

Our algorithms are really, really primitive, and that, combined with the fact that we are hopeless at making sense of anything 'random', should be an indicator not to make overly inductive statements like this.
Err, no.

The set of definable numbers is countable. How can you compute a number and still have it not be definable?
chiro
#5
Jan19-12, 11:30 PM
P: 4,570
Quote by pwsnafu:
Err, no.

The set of definable numbers is countable. How can you compute a number and still have it not be definable?
Can you link me to a result or a definition of what 'definable' is please?
SteveL27
#6
Jan20-12, 12:18 AM
P: 799
Quote by chiro:
Can you link me to a result or a definition of what 'definable' is please?
http://en.wikipedia.org/wiki/Definable_real_number

However, I didn't mention these originally; I only mentioned the computable numbers. Both are countable sets, but they are not identical.

In any event, it's clear enough that there are only countably many finite-length strings over a countable alphabet, and that there are uncountably many reals. So there are uncountably many reals that have no possible algorithmic or finite-length description of any kind.

Now, it is possible that in the future, people will no longer care. Perhaps 100 years from now people will give up on set theory and base mathematics on algorithmic information theory. In that case, people will come to regard uncountable sets as an artifact of 20th century set theory that's weird but ultimately irrelevant.

In other words, our current understanding of uncountable sets would not become false, but it is possible that it might become unfashionable.

Of course if you throw out uncountable sets you lose the completeness of the reals ... but there are even people working on that, in the field of "constructive analysis." These kinds of explorations are currently off to the side of mainstream math ... but who's to say they won't be seen as fundamental in the future?

To sum up: in current math, there are undefinable numbers. In the future, perhaps mathematicians will become more influenced by the theory of computation, and will no longer care. That's the only way I can interpret your suggestion that, in the future, what I said might not be true. It would still be true; it's just that mathematicians would no longer care.

Do you know Chaitin? He has defined a number that is definable but not computable.

http://en.wikipedia.org/wiki/Gregory_Chaitin
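In case it helps to see it written down: for a fixed prefix-free universal machine U, Chaitin's constant is (roughly) Ω = Σ 2^(-|p|), the sum running over all programs p that halt when run on U, where |p| is the length of p in bits. That is a perfectly good finite definition, so Ω is definable; but knowing its digits would let you solve the halting problem, so it is not computable. (The exact value depends on the choice of U, but that doesn't affect the point.)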
chiro
#7
Jan20-12, 01:32 AM
P: 4,570
Hey SteveL27.

I was going to suggest some of the things you said in the above post, but you've already mentioned them, so I won't repeat them.

Having said that, it is impossible to compute pi exactly using whole numbers and arithmetic, and I imagine most numbers are like that, so in terms of computability I absolutely agree.

The only thing I disagree with is whether there can be a language that describes such a number and also describes how to compute it. The example I used is pi, which can be expressed as an infinite series, and as a speculative guess, I imagine math in the distant future being able to project highly chaotic and random information onto a minimal basis of some sort, to get a compact form of an infinite series for some number. It's speculation, but it's still my opinion.

For the specifics, think of infinite series for generic real numbers. At the moment the number of these we know is pretty small. We use one form to compute pi, but I imagine that in the distant future there will be some kind of framework for finding a function that can be used in an infinite series to calculate a given representation.

Since it is an infinite series, there is enough information to represent uncountably many numbers, but in this case we are not talking about Turing machines or anything finite.

The big thing, however, is figuring out the right function. The functions we have right now are really simple, and the ones with known infinite-series expansions are even fewer (e^x, the trig functions, the hyperbolic functions, and so on).

So I was wrong about computation if we are talking about algorithms based purely on arithmetic with whole numbers, and I concede that.

That said, if we end up with computing devices that work on structures other than the integers, under some other computational framework, then this will probably not hold in that context.

I am also aware of Chaitin. He has done some pretty amazing work.
pwsnafu
#8
Jan20-12, 02:14 AM
Sci Advisor
P: 773
Quote by chiro:
That said, if we end up with computing devices that work on structures other than the integers, under some other computational framework, then this will probably not hold in that context.
We definitely need to go beyond digital computation. I guess quantum Turing machines would be the only model that would get anywhere near what you are proposing.
Stephen Tashi
#9
Jan20-12, 10:16 AM
Sci Advisor
P: 3,167
Let's see - a question for a formal language: describe something that this language cannot describe?

