# Logic & Computation

1. Dec 29, 2005

### heman

Are logic and computation two sides of the same coin? If yes, how?

2. Dec 29, 2005

### neurocomp2003

You need logic to do computation... but are you asking about the fields of study?

Logic is more about truths and proofs,
whereas computation (or computability) is about (1) whether you can design an algorithm to solve a problem, and (2) whether that algorithm is reasonable (the P vs. NP question).

3. Dec 29, 2005

### shruth

Isn't logic the set of rules that direct computation? Something like grammar and language, maybe.

4. Dec 29, 2005

### fargoth

It depends on the definition of logic. In my view, logic is the ability to deduce facts from raw data.
You can use computation to deduce facts, but you can't deduce all facts by computation (e.g. the tiling problem).

5. Dec 29, 2005

### -Job-

An algorithm maps an input to an output. The set of all inputs that the algorithm can receive is that algorithm's domain, and the set of outputs is the range. For every algorithm A there is an equivalent mathematical function f(x) where x is the number given by the binary representation of the input.
For example, take the sorting problem: you are given a sequence of n numbers (each with a maximum of k bits). If you take the binary representations of these n numbers and append them to each other, forming a string of k*n bits, you now have your input x. A function f(x) then only needs to produce the y whose bits are those of the numbers in the sorted sequence, appended in the same way.
For every sorting algorithm there is an equivalent function f(x). By equivalent I mean that f(x) maps the same inputs to the same outputs as the sorting algorithm does. From this perspective, every sorting algorithm is equivalent to the function f(x). The difference between the algorithms is only in the speed of execution, which traditional mathematics doesn't really care about. There are infinitely many such f(x), but notice that an ordinary mathematical function doesn't have flow control, which means that every time you need to get a y from an x, the whole function f(x) needs to be computed (you can't skip portions of f(x)).
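The encoding described above can be sketched in a few lines. Everything here (the function names, the 3-bit example) is illustrative, not from the thread; it just shows how a sorting algorithm corresponds to a function from one integer x to one integer y:

```python
def encode(nums, k):
    """Pack a sequence of k-bit numbers into one integer x
    by concatenating their binary representations."""
    x = 0
    for n in nums:
        x = (x << k) | n
    return x

def decode(x, n, k):
    """Unpack n k-bit numbers from the integer x."""
    mask = (1 << k) - 1
    nums = []
    for _ in range(n):
        nums.append(x & mask)
        x >>= k
    return list(reversed(nums))

def f(x, n, k):
    """A mathematical function equivalent to a sorting algorithm:
    it maps the encoded input x to the encoded sorted output y."""
    return encode(sorted(decode(x, n, k)), k)

x = encode([5, 2, 7], 3)   # three 3-bit numbers packed into one integer
y = f(x, 3, 3)
print(decode(y, 3, 3))     # [2, 5, 7]
```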
What we can do is split f(x) into a bunch of statements:
x1 = f1(x)
x2 = f2(x1)
x3 = f3(x2)
....
y = fi(xi)
Now, f1, f2, f3 may be of the kind:
f(x) = <some math expression> if a<x<b
= <some constant> else

Because these statements map inputs (x) to outputs (y) in the same manner as f(x), they are "equivalent" to the function f(x). The difference is that the statements can execute much faster by using flow control (if/else) to avoid unnecessary computation.
Notice that computer algorithms are exactly these statements. Every algorithm has many equivalent f(x) but the algorithm speeds up execution by breaking the function f(x) into pieces (instructions).
When you ask whether logic and computation are equivalent, you are really asking whether logic and mathematics are equivalent, because the theory of computation is a branch of mathematics that is particularly interested in how fast we can map an input to an output.
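A minimal sketch of the flow-control point above, with hypothetical functions (none of this is from the thread): both versions compute the same mapping, but the monolithic form evaluates every piece on every call, while the statement form skips the expensive piece when it isn't needed.

```python
def f_monolithic(x):
    """Like a closed-form expression: every part is evaluated
    every time, with no branching to skip work."""
    cheap = x + 1
    expensive = sum(i * i for i in range(10_000))  # always computed
    return cheap if 0 < x < 10 else expensive

def f_statements(x):
    """The same input-output mapping, split into statements.
    Flow control lets us skip the expensive piece entirely."""
    if 0 < x < 10:                                 # the "if a<x<b" piece
        return x + 1
    return sum(i * i for i in range(10_000))       # the "else" piece
```

The two functions are "equivalent" in -Job-'s sense (same inputs to same outputs); only the cost of computing them differs.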

6. Dec 29, 2005

### heman

Job... do you mean that, in some way, you break the bigger problem into smaller components and solve them separately?
It was very interesting... can you illustrate this with an example, like how you are choosing the fi's and how they actually speed things up?

7. Dec 30, 2005

### Hurkyl

Staff Emeritus
Yes, formal logic and computability theory are extremely close subjects. In fact, a good fraction of what I know about formal logic has been from the computability theory viewpoint.

First, I would like to point out this quote from Wikipedia:
-Job- seemed to be mainly speaking about complexity theory, but it's computability theory that has the close relation to formal logic.

The connection[1] comes from the fact that if a statement has a proof, then there exists an algorithm that can find the proof. It's a stupid algorithm, but it works.

To prove a statement P:
(1) For every possible sequence of symbols:
(2) ::: If the sequence of symbols denotes a valid proof of P,
(3) :::::: then accept P as provable.

This assumes that there are only countably many possible symbols, and that there exists an algorithm to verify if a given statement is one of the axioms under consideration.
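The brute-force search above can be written down directly. This is only a sketch: `is_valid_proof` is a placeholder for the proof-verification step, which the argument assumes is decidable, and all names here are illustrative.

```python
from itertools import count, product

def find_proof(P, symbols, is_valid_proof):
    """Enumerate every finite sequence of symbols in order of
    increasing length, checking each one.  Terminates if and only
    if some proof of P exists; otherwise it runs forever."""
    for length in count(1):
        for candidate in product(symbols, repeat=length):
            if is_valid_proof(candidate, P):
                return candidate   # accept P as provable
```

For instance, with a toy checker that accepts only the string "ab" as a proof, `find_proof("P", "ab", lambda c, P: "".join(c) == "ab")` returns `('a', 'b')`.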

Conversely, if any algorithm is capable of generating a proof of P, then obviously P is provable!

This connection allows important questions about logic to be transferred to a question about Turing machines. For example:

I might ask if my theory is complete: for any P, we have that either P or ~P is a part of my theory (i.e. provable from the axioms).

Which translates into the question about whether the following algorithm is guaranteed to terminate:

(1) Simultaneously run two copies of the above machine on P and on ~P.
(2) If the first copy terminates, accept P as provable.
(3) If the second copy terminates, reject P as disprovable.
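The two-copy machine above can be sketched with generators, under the same assumptions as before (a decidable `is_valid_proof` checker; all names hypothetical). The two searches are interleaved one step at a time, so that neither can starve the other:

```python
from itertools import count, product

def proof_search(statement, symbols, is_valid_proof):
    """Generator that searches for a proof of `statement`.
    It yields None after each failed check, so a caller can
    interleave two searches (dovetailing), and yields the
    proof itself when one is found."""
    for length in count(1):
        for candidate in product(symbols, repeat=length):
            if is_valid_proof(candidate, statement):
                yield candidate
                return
            yield None

def decide(P, not_P, symbols, is_valid_proof):
    """Run the proof searches for P and ~P in lockstep.
    If the theory is complete, one of them must halt."""
    search_p = proof_search(P, symbols, is_valid_proof)
    search_not_p = proof_search(not_P, symbols, is_valid_proof)
    while True:
        if next(search_p) is not None:
            return "provable"       # first copy terminated
        if next(search_not_p) is not None:
            return "disprovable"    # second copy terminated
```

If the theory is incomplete, `decide` loops forever, which is exactly the point of the translation: completeness of the theory becomes a termination question about this machine.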

The proof I know of Gödel's First Incompleteness Theorem was done using Turing machines.

Another topic one might consider is that of a recursive function. It turns out that this is the same thing as a computable function!

I'm sure I read a quote in a textbook to the effect that computability theory and formal logic are so tightly intertwined that it would be a futile effort to try to mark a line saying what the difference between the two theories is.

[1]: Don't construe this to mean I'm convinced that this is the only connection between the two fields.

8. Dec 30, 2005

### -Job-

Notice also that the hardware of a computer is built upon very basic logical operations, namely AND and NOT:
Code (Text):
a AND b = 1 if a=b=1, 0 otherwise
NOT a = 0 if a is 1, 1 otherwise
From these we can build the OR and the XOR operations:
Code (Text):
a OR b = 0 if a=b=0, 1 otherwise
a XOR b = 0 if a=b, 1 otherwise
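As a sketch (illustrative code, not from the thread), here is one standard way to build OR and XOR out of nothing but AND and NOT, using De Morgan's law for the OR:

```python
def AND(a, b):
    """The basic AND gate: 1 only when both inputs are 1."""
    return 1 if a == 1 and b == 1 else 0

def NOT(a):
    """The basic NOT gate: inverts its input."""
    return 0 if a == 1 else 1

def OR(a, b):
    """De Morgan: a OR b = NOT(NOT a AND NOT b)."""
    return NOT(AND(NOT(a), NOT(b)))

def XOR(a, b):
    """a XOR b = (a OR b) AND NOT(a AND b)."""
    return AND(OR(a, b), NOT(AND(a, b)))

for a in (0, 1):
    for b in (0, 1):
        print(a, b, OR(a, b), XOR(a, b))
```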
Computers represent numbers in binary:
Code (Text):
0 = ...00000
1 = ...00001
2 = ...00010
3 = ...00011
4 = ...00100
...
What this means is that we can easily perform the above operations on these binary numbers, bitwise, for example:
Code (Text):
001 AND 101 = 001
With the AND, OR, NOT, XOR operations we can perform addition and subtraction. With addition and subtraction we can perform multiplication and division and so on. These basic operations are implemented as circuits inside the processor in a region called the ALU (Arithmetic Logic Unit).
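The claim that addition can be built from these bitwise operations can be sketched as a ripple-carry adder, the textbook construction (the function names and 8-bit width here are illustrative choices, not from the thread):

```python
def full_adder(a, b, carry):
    """One-bit full adder from basic gates:
    sum = a XOR b XOR carry; carry_out = majority(a, b, carry)."""
    s = a ^ b ^ carry
    carry_out = (a & b) | (a & carry) | (b & carry)
    return s, carry_out

def add(x, y, width=8):
    """Ripple-carry addition of two `width`-bit numbers:
    chain full adders from the least significant bit upward,
    much as an ALU's adder circuit does."""
    result, carry = 0, 0
    for i in range(width):
        a = (x >> i) & 1
        b = (y >> i) & 1
        s, carry = full_adder(a, b, carry)
        result |= s << i
    return result

print(add(2, 3))   # 5, matching the ALU example in the text
```

Note that, like real hardware, the result wraps around modulo 2^width when the sum overflows.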
The way you use the ALU is you pass one or two numbers to it and an operation (AND, OR, NOT, Addition, Subtraction, Multiplication...) and it gives you the output.
An ALU has a bit sequence for each operation, like:
Code (Text):
000 = AND
001 = OR
010 = NOT
...
So if you pass the numbers 2 (...010) and 3 (...011) and the operation addition (...011) to the ALU, it will give you 5. This is called an instruction, and every program (browser, word processor...) is converted into a sequence of these instructions, each specifying the operation, the operands, and where the result should be put. For example it might look like this (if the processor is 32-bit):
Code (Text):
00011001001001000111100000100111
00011010111001000111100000100110
00011001001001000111100110001101
00011001001001000101110111010110
.....
Obviously it's not very easy to program with 0s and 1s, so Programming Languages were invented (C, C++, Fortran, Lisp...) which have constructs like:
Code (Text):
if <...> do <...> else do <...>
while <...> do <...>
...
These languages have associated compilers which create the corresponding machine code.
Consider an algorithm written in C. Before you can run it you have to compile it into machine code (0s and 1s). When you do run it these instructions are fed to the processor one by one which uses the ALU as needed. So you can see how every algorithm has a corresponding sequence of logical operations. This is another way how logic and computation are related.
This sequence of logical operations can be combined into a single function f(x), but it's not a simple procedure at all. If you try to convert even a simple algorithm into a mathematical function, it will probably end up being a pretty large function. If in turn you were to feed that function to a computer and compare the runtime to the runtime of the sequence of instructions, you'd very likely find that it would be slower by a constant factor or worse.
Computability theory is closely related to logic, but that's because it's built upon logic. If you take a theory of computation class, you'll be discussing sets and all that stuff, which I believe comes from logic. As a matter of fact, I suspect there isn't a field of science that isn't related to logic.

Last edited: Dec 30, 2005
9. Dec 30, 2005

### scott1

Logic is the study of argument. I think logic is the science of finding out whether something is true or not, by using other data and comparing it to the current argument to determine whether it is true or not.

10. Dec 31, 2005

### Hurkyl

Staff Emeritus

In one text I have checked out, Mathematical Logic (Ebbinghaus, Flum, and Thomas), the word "truth" doesn't appear until page 183 out of 289. (If I can believe the index) Furthermore, the word is only used in quotes, indicating that it's only speaking loosely. (the text quickly clarifies what it really meant to say)

Formal logic starts off by discussing pure syntax -- studying grammar in a natural language would be a decent analogy. (But, of course, formal logic is much more precise, and would be easier too if we hadn't spent all our lives learning grammar.) In fact, formal logic uses words such as "language" and "grammar"!

There are all sorts of things about which one can speak purely syntactically, such as the notion of proof. A proof is merely a sequence of "calculations": you start with some collection of sentences (the hypotheses), and you produce new sentences from the old ones using a certain set of operations. (In fact we call such a set of operations a "calculus")

Then, one might speak about semantics: how one might interpret strings of symbols. E.G. if I take a language with operator symbols $\cdot, {}^{-1}, e$ which are binary, unary, and nullary respectively, I might be interested in the following set of sentences:

$$(\forall x)x \cdot x^{-1} = e$$
$$(\forall x)(\forall y)(\forall z) (x \cdot y) \cdot z = x \cdot (y \cdot z)$$
$$(\forall x) e \cdot x = x \cdot e = x$$

These are just strings of symbols. But I can give them a meaning: I can talk about some set G, and define operations on G that correspond to the three operator symbols. Then, I might ask if this interpretation is a model of the above sentences: that is, through the correspondence I just mentioned, whether these three sentences are valid. (Heuristically speaking, whether they are "true" in the model.)

A model of these three sentences, of course, is nothing more than a group!

One is also interested in theories: that is, sets of sentences that are closed under deduction. For example, the theory of the above three sentences I mentioned would also include sentences such as $e \cdot e = e$.

One might now want to ask questions such as:
If I can model a set of sentences, are they consistent?
If I have a consistent set of sentences, can I find a model of them?
Is a theory complete? I.E. for any sentence P, is either P or ~P in the theory?
If a sentence is true in every model of a theory, must it have a proof? (And thus be a part of the theory)
Can I have two non-isomorphic models of the same theory? Even if the theory is complete?
How can I build new models out of old ones?

Gödel's completeness theorem and two incompleteness theorems are related to these questions. The foundations of nonstandard analysis are firmly rooted in formal logic as well.

11. Dec 31, 2005

### -Job-

Hurkyl, some of the stuff you mentioned, like completeness, recognizability & co-recognizability, I learned in computer classes together with Regular Expressions, Grammars, FSMs, PDAs, Turing Machines, etc. I didn't realize that much of this stuff came from the field of logic.

12. Jan 1, 2006

### honestrosewater

Would you mind elaborating? I can't imagine what formal logic would be without truth. (Though meaningless comes first to mind.) Call the objects that are usually called 'truth' and 'falsehood' (or whatever 'truth'-values you have) whatever you wish; if you don't assign those objects to your statements, well, who cares what you can prove? You can prove anything -- proofs are just strings of symbols, as you said (and you get to make up the rules generating them).

Perhaps you mean that you tend to spend more time and have more options with the syntax, or calculus, than with the semantics. (?) But truth seems to be the main reason (in addition to perhaps efficiency, convenience, tradition, beauty) for choosing some set of connectives, inference rules, or axioms over others. All connectives in classical logic are truth-functional; inference rules preserve truth; axioms are tautologous. Do you think that's a coincidence? Is any old calculus worth spending your time in?

Can you define validity, satisfaction, consistency, soundness, or completeness without the objects usually called truth-values?

Or maybe you mean that truth has special meanings in logic. (?) I'm really curious about what you have in mind, because if you asked me, I'd say that logic is about truth, proof, and their relationship.
Nope, it lies. If Amazon's search works, "true" first appears on page 27 and "truth" on page 31.

13. Jan 1, 2006

### Hurkyl

Staff Emeritus
Here are some excerpts from Mathematical Logic (Ebbinghaus, Flum, Thomas)

3.2 Definition of the Satisfaction Relation. For all interpretations $\mathfrak{J} = (\mathfrak{A}, \beta)$, we let
$\begin{array}{l @{\quad \text{:iff} \quad} l} \mathfrak{J} \models t_1 \equiv t_2 & \mathfrak{J}(t_1) = \mathfrak{J}(t_2) \\ \mathfrak{J} \models Rt_1 \ldots t_n & R^{\mathfrak{A}}\mathfrak{J}(t_1) \ldots \mathfrak{J}(t_n) \quad \text{(i.e. } R^{\mathfrak{A}} \text{ holds for } \mathfrak{J}(t_1), \ldots, \mathfrak{J}(t_n)\text{)} \\ \mathfrak{J} \models \neg \varphi & \text{not } \mathfrak{J} \models \varphi \\ \mathfrak{J} \models (\varphi \wedge \psi) & \mathfrak{J} \models \varphi \text{ and } \mathfrak{J} \models \psi \end{array}$
(and so forth)

7.1 Definition. (a) $\Phi$ is consistent (written: $\mbox{Con } \Phi$) if and only if there is no formula $\varphi$ such that $\Phi \vdash \varphi$ and $\Phi \vdash \neg \varphi$.
(b) $\Phi$ is inconsistent (written: $\mbox{Inc } \Phi$) if and only if $\Phi$ is not consistent (that is, if there is a formula $\varphi$ such that $\Phi \vdash \varphi$ and $\Phi \vdash \neg \varphi$).

(Though, in commenting on definition 3.2, the text does say "the reader should convince himself that $\mathfrak{J} \models \varphi$ if and only if $\varphi$ becomes a true statement under the interpretation $\mathfrak{J}$.")

The text also makes a big deal that the usual operations on truth values are not the same as the logical connectives in the formal language. As far as I can tell, it only talks about them in one section, and uses symbols such as $\dot{\vee}$ and $\dot{\wedge}$ to distinguish them.

I say that logic isn't about truth because I never really see it used in anything I've read on formal logic. Certainly there is an intimate connection between boolean algebra and the usual rules of deduction ({T, F} is not the only suitable Boolean algebra), but Boolean algebra seems to be more useful for algebraic and computational tasks.

I am by no means an expert on formal logic -- this is merely the impression I've gotten from what I've studied thus far.

Formal logic does have plenty of direct mathematical use -- it has rather strong connections to things like Universal Algebra, Nonstandard Analysis, and Real Algebraic Geometry.

But as for what I think was the intent of your question, I think that I've been a formalist a long time -- I believe that mathematics makes no attempt to ascribe meaning to anything. That task is better left to other disciplines (such as physics, or even metamathematics)

Last edited: Jan 1, 2006
14. Jan 1, 2006

### honestrosewater

Okay, satisfaction is enough. (BTW, their 'extensional' is my 'truth-functional'.) The objects that I'm referring to would be assigned by what they call an assignment ($\beta$). That's what I'm asking about: the maps from the formulas to the objects that let you define, for example, how the truth-value, or semantic entailment, of a compound formula depends on the truth-value, or semantic entailment, of its constituents. Where are the objects that allow you to state that? It seems they've relegated them to the preceding discussion and to your informal understanding. But when they say, for example, $\mathfrak{J} \models \neg \varphi \mbox{ :iff not } \mathfrak{J} \models \varphi$, they're referring to the set of truth-values {T, F} that they've just finished talking about. I don't know why they don't just put them directly into the definitions -- the truth-values aren't even necessarily part of the formal language (they are part of the metalanguage, of course). I think the only requirement is that the truth-values be distinct.

My book, for example, simply uses truth-values to define satisfaction (using Ebbinghaus et al.'s terms): if f is a formula and J is an interpretation such that J(f) = T, we say that J satisfies f and write J |= f. And similarly for the others.

Is $\mathfrak{J} \models \varphi$ left undefined, or did I miss the definition?
Sure, the maps are part of the metalanguage. They are applied to the connectives in the formal, object language so that you can determine, for example, whether a formula you've proved is a tautology or not.
Perhaps you don't see it because it's already been worked into the calculus.
Anywho, whatever you call it, it seems necessary to me. Of course, I could be wrong -- that's why I'm asking. Do you think 'if I can prove it, is it true?' and 'if it's true, can I prove it?' are central questions in logic?
Hm, I only mean to use truth and such as they are used in formal logic. I'm not talking about 'real world truth' or anything of that sort. I'm all for the cleanest divide possible between the object language and ...everything else. If a system can do without those (in classical logic) two distinct objects that are normally called truth-values and assigned to formulas, more power to them, as far as I'm concerned. I just don't see how it's possible.

Last edited: Jan 1, 2006
15. Jan 1, 2006

### Hurkyl

Staff Emeritus
Their $\beta$ isn't a truth-assignment: it's a map from the set of variables into the domain A defined by the underlying structure.

From this, for the interpretation $\mathfrak{J} = (\mathfrak{A}, \beta)$, they define a map $\mathfrak{J}(_)$ from the terms into the domain A.

In particular, neither of these functions are defined on the set of formulas, and do not map into anything resembling a set of truth values.

The above is the definition of $\mathfrak{J} \models \varphi$: it defines it first for the atomic formulae, and then recursively for compound formulae.

If you were following their exposition, you'd have to do things the opposite way that your book does: the truth-valuation would be defined in terms of the satisfaction relation.

The two ideas, of course, naturally correspond: for any particular $\mathfrak{J}$, the satisfaction relation is merely a subset of the set of formulas, and the truth-valuation is the corresponding characteristic function.
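This correspondence is simple enough to show with a toy example (everything here is made up for illustration: the formula strings, the satisfaction set). For a fixed interpretation, the satisfaction relation is just a set of formulas, and the truth-valuation is that set's characteristic function:

```python
# For one fixed interpretation J, the satisfaction relation is
# the set of formulas that J models (a hypothetical example set).
satisfied = {"p", "p|q"}

def truth_valuation(formula):
    """The characteristic function of the satisfaction set:
    T exactly when the formula is in the set."""
    return "T" if formula in satisfied else "F"

for f in ["p", "q", "p&q", "p|q"]:
    print(f, truth_valuation(f))
```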

No, I don't! Without some additional qualification, I have no idea what those two questions even mean!

Last edited: Jan 1, 2006
16. Jan 2, 2006

### honestrosewater

Sorry, I shouldn't have said anything yesterday; I thought I was in a better condition than I was. Anywho, yeah, I see.
Is it true that for all formulas f, if |= f, then |- f, and if |- f, then |= f? Does the set of tautologies equal the set of theorems?

17. Jan 3, 2006

### Hurkyl

Staff Emeritus
That I certainly find to be an important question. (Assuming that |= P means that P is a consequence of the empty set of formulas)

18. Jan 3, 2006

### honestrosewater

Yep.
Speaking loosely (hoping you understand well enough what I mean), it seems like you might be able to leave truth out of any discussion by just keeping it at least one discussion above, in a meta-level, and using it to kind of sort what you let through to the object-level (e.g. which definitions you choose).
I don't know. I'm just starting to take a new look at logic and its applications, looking more at natural language, meaning and semantics than at computation and syntax. Truth being a part of logic, even formal and symbolic logic, seems like a no-brainer, but I don't see where it enters the picture anymore.