# What is computation, really?

1. Jul 29, 2015

### Mr Davis 97

So I've delved into programming, and gotten experienced with the fundamentals. However, the more I learn, the more I question the central object of comp. science, computation, and its foundation. According to Wikipedia, "Computation is any type of calculation that follows a well-defined model understood and expressed as, for example, an algorithm." This is well enough, but I am interested in the concept of the mathematical function and its relation to computation. During the incipient phase of computer science, Turing and many others seemed to characterize computability in terms of functions. For example, the Church-Turing thesis is stated as saying that there is no effective model of computing that can compute more mathematical functions than a Turing machine. This seems to imply that all computations involve the mathematical idea of functions (perhaps this is due to that computing involves algorithms, and functions can effectively model algorithms).

Essentially, my question is, is the notion of computation wholly related to functions? Are the two terms somehow synonymously related?

2. Jul 29, 2015

### Dr. Courtney

An awful lot of it certainly is, especially when you think of it in regard to solving physics problems. Most would agree that a good computer program should give the same output every time for a given input.

But certain probability-related computer "functions" actually fail to fit the mathematical definition of a function. Consider a random number generator. You can pretty easily use one to generate a sequence with a well-defined mean and standard deviation, but with no way to predict the output of the next call. It may be doing exactly as expected, yet no longer satisfy the strict mathematical definition of a function (one output for any given input in the domain).
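This can be seen in a few lines of C. A minimal sketch using the standard library's `rand()`; the helper name `draws_all_equal` is mine, invented for illustration:

```c
#include <stdlib.h>

/* rand() takes no arguments, yet successive calls return different
   values -- so viewed as a map from its (empty) argument list to its
   result, it fails the strict "one output per input" definition of a
   mathematical function. The hidden generator state is the real input. */
int draws_all_equal(unsigned seed, int n) {
    srand(seed);
    int first = rand();
    for (int i = 1; i < n; i++)
        if (rand() != first)
            return 0;   /* outputs varied across identical "inputs" */
    return 1;
}
```

For any reasonable implementation, ten successive draws will not all be equal, even though every call looks identical at the source level.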

3. Jul 30, 2015

### Jarvis323

But there is no known true random number generator. All computer algorithms for random number generation are only pseudo-random number generators, and they do indeed have exactly one output for each input. Still, this is a good example, because I believe that random number generation is something a Turing machine cannot do; it may well be considered computation, but the question remains whether it can be done at all.
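A quick C sketch of this determinism (the helper name is mine, for illustration): reseeding the standard library's PRNG with the same seed replays exactly the same stream, so each draw is one fixed value for each (seed, index) pair.

```c
#include <stdlib.h>

/* Returns the n-th value (0-based) of the pseudo-random stream for a
   given seed. Because the generator is deterministic, this really is a
   mathematical function of the pair (seed, n). */
int nth_from_seed(unsigned seed, int n) {
    srand(seed);
    int v = 0;
    for (int i = 0; i <= n; i++)
        v = rand();
    return v;
}
```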

And this is a fascinating topic, because if true randomness is not possible, then it could imply that every event that has or will ever take place has always been predetermined.

Last edited: Jul 30, 2015
4. Jul 30, 2015

### FactChecker

For all practical purposes, this is not true. Not all random number generators are started with a known seed. Even the pseudo-random number generators are often not used in a way that makes their number stream predictable or repeatable.

5. Jul 30, 2015

### fluidistic

All chess programs I know of that can use multiple cores are non-deterministic when using more than one thread. I.e., given a chess position, they will return a different evaluation every single time they are run. On the other hand, when using one thread they are completely deterministic.

Now I don't know whether the randomness is "true" or not. I believe that the output of these chess programs somehow depends on the temperature of the processors (and this isn't deterministic), because it depends on the speed of the processors, which in turn depends on their temperature. I would tend to think that multi-threading, in chess programs at least, leads to true randomness in the output (for example the evaluation, or the total nodes searched for a given depth, etc.). I ran some randomness tests on these kinds of data and they couldn't spot any predictive behavior, though that doesn't mean the data was truly random, of course.

6. Jul 30, 2015

### William White

I think ERNIE is a true random number generator

7. Jul 30, 2015

### Jarvis323

In computability theory, what things seem like, or what is practical, is irrelevant. Even a process that would require more memory than could be constructed from all of the materials in the universe, and more time than the age of the universe, may be considered computable, while something we can very easily approximate with extreme precision may not be considered computable.

And mathematically, even ERNIE's random number generation is a function, as well as chess moves in a multi-threaded chess game. All of the input is input, including voltages, temperatures, or whatever else has an influence on the outcome.

The question of whether or not true random number generators exist is much more fundamental, and ultimately is a potentially unanswerable physics question.

Last edited: Jul 30, 2015
8. Jul 30, 2015

### Jarvis323

If you want to learn more about computability and random numbers, then I suggest reading this excellent post from the theoretical computer science stack exchange.

http://cstheory.stackexchange.com/questions/1263/truly-random-number-generator-turing-computable

9. Jul 30, 2015

### FactChecker

The result of an unknown function of unknown inputs must be considered unknown and, to some extent, random.

10. Jul 30, 2015

### aikismos

In regards to the @Mr Davis 97 post:


Essentially, my question is, is the notion of computation wholly related to functions? Are the two terms somehow synonymously related?

Wikipedia is a tertiary source, and as such, I think you have to take their definition with a grain of salt. Computation is better defined, IMHO, as any system that embodies the information-processing cycle: information is input into an information-tight system, stored and changed (processed) inside it, and eventually output. This is an important distinction, because I consider "well-defined" a rather strict formality that tends to undermine the inclusion of the human brain as a computer. (Admittedly an ontological position I have chosen.)

The real question of why functions are used as a model in computing should probably start with us noticing that the first computer scientists were mathematicians. The ancient Greeks, Babbage, Lovelace, Zuse, von Neumann (just to list some of the earliest pioneers) were all mathematicians to various degrees. In fact, the formalism that describes the basis of digital computation in use (1's and 0's) was laid out by the mathematician George Boole in the mid-nineteenth century (for which he has been immortalized every time one of us code-slingers declares Dim bln_Temp As Boolean or some such thing... :) ) But a previous poster was spot on when he said that functions by definition are mathematical constructs that reliably produce the same output given the same input. While you can certainly point to non-deterministic computation to show that not all computers are deterministic and predictable (who wants a calculator where 2 + 2 maps to 4 only some of the time?), most of us practical schmoes prefer to know that what we get out of our code is logically consistent with our inputs. This is in contrast to the notion of a non-functional relation, in which an input can return two or more outputs.

In fact, in common parlance "function" is sometimes used interchangeably with "procedure" or "routine", but they are not the same, and some programming languages strictly enforce a difference in definition. To be precise, a routine is any group of instructions that is stored in a separate place and can be called repeatedly. Routines come in two flavors: the procedure (a proc or sub) and the function. Some programming languages, such as Visual Basic, strictly enforce the difference, requiring a distinct definition for each (Sub and Function in this example). Procedures usually accept input but return no output, whereas functions do return a value. So, I might create a procedure called void PrintOutput(int Flag) which accepts an integer called Flag as input and then takes data and sends it out to the screen, but doesn't return anything to the code from which it's called. Contrast that with int Add(int Addend, int Adder), which accepts two integers and returns the sum. The former is a procedure and the latter is a function, both of which are routines.

So, it's not hard to see how someone who is used to doing $2 + 2 = 4$ and then $f_{+}(2,2) = 4$ would want to generalize that to int Add(int Addend, int Adder) and call both of them functions.
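The two routines named above, written out as compilable C (a minimal sketch of the distinction, with trivial bodies supplied only for illustration):

```c
#include <stdio.h>

/* A procedure: accepts input, performs an action, returns no value. */
void PrintOutput(int Flag) {
    printf("flag = %d\n", Flag);
}

/* A function: accepts input and returns a value to the caller. */
int Add(int Addend, int Adder) {
    return Addend + Adder;
}
```

Both are routines; only `Add` returns a value to the code that called it.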

Moderator action: Some off-topic content was redacted from this post.

Last edited by a moderator: Aug 1, 2015
11. Jul 30, 2015

### Jarvis323

Moderator action: Some off-topic content was redacted from this post.

A function maps a specific point in the domain to one and only one point in the co-domain. This is something that random number generators seem to violate. My point is that all influences, or variables that play a role in the calculation, must be considered input. One perceived point in the domain may truly be two different points in the domain, which map to different outputs. This is trivial when it comes to the pseudo-random number generators used in modern computers.

Now the question is whether there could exist a random number generator that violates the rules of a function, and how far to go in including any possible influencing factors as components of points in the domain. Well, the first question is actually a bad question, because any real random number generator would violate the rules of a function; so the question is really whether a random number generator could exist at all. As for the second question, you can go as far as considering the entire state of the universe as the point in the domain, and the question ultimately (taken to its full extent, to find the most fundamental, hard, mathematical truth) becomes whether or not the universe is deterministic. I hope this makes it clearer what I was trying to say.

Here is a simple example that should help clarify my main point,

Code (C):

int someFunc( int x ) {
    static int z = 0;   /* z persists across calls */
    z += x;
    return z;
}

The above is still a function mathematically, once the hidden static state z is counted as part of the input, even though it will produce different output for different calls with the same argument x.
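The same accumulator can be rewritten with the state made explicit, which is the sense in which it remains a function (a sketch; the name someFuncExplicit is mine):

```c
/* The hidden static z is lifted into a parameter. Viewed as a map
   (x, z) -> z + x, each input pair yields exactly one output, so the
   strict mathematical definition of a function is satisfied, even
   though repeated calls with the same x return different values. */
int someFuncExplicit(int x, int *z) {
    *z += x;
    return *z;
}
```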

Jargon is necessary, as English is ambiguous, and Mathematics requires precise definitions in specific contexts.

Last edited by a moderator: Aug 1, 2015
12. Jul 31, 2015

### FactChecker

Probability and random behavior are intimately tied to information theory. Any time there is incomplete information and/or knowledge, there can be "random" behavior. Even given the most information possible, the quantum level presents uncertain random behavior that cannot be removed. At the larger, non-quantum level, there are certainly ways to computer-generate random numbers where the user does not have enough information to predict them.

13. Aug 1, 2015

### D H

Staff Emeritus

The concept of computation is highly technical. Demands for an "explain it like I'm five" explanation are off-topic.

14. Aug 1, 2015

### aikismos

@tAllan

Actually most function calls to random number generators are not mathematical functions because they don't have informational inputs. The physical inputs of the system are irrelevant when considering the flow of information into and out of the information system.


15. Aug 1, 2015

### Jarvis323

I don't follow. If information derived from a physical process doesn't flow into the system, then how does it affect the way the system calculates? Besides, all input to any computer is physical at the lowest level anyway. In fact, your argument implies that mathematical functions don't exist at all in the physical world, which might be an interesting metaphysics topic, but it's a hard concept to incorporate into a discussion about mathematical models of computation.

Pure mathematical functions are independent of the constraints of physical reality. When considering whether a random number generator is a function, it comes down to whether it is a deterministic process. If it is, then you can certainly model the process as a function. None of this depends on syntax and code, on how the process is expressed in a programming language, or on what form information is stored or transferred in.

Last edited: Aug 1, 2015
16. Aug 1, 2015

### FactChecker

People as smart as Einstein believed that everything is deterministic. He was only proven wrong by experiments in quantum theory. The assumption that computer calculations are deterministic functions has less certainty now that quantum effects are becoming important in computer design. Error correcting code, fault tolerant circuits, and quantum computers are important now and in the future.

But even if we assume that everything is a deterministic function, there is a mathematical distinction between functions where we know all the inputs and functions where we do not. We often have incomplete information. There are mathematical fields (Bayesian) to study the adjustments that should be made in probabilities when our information is incomplete and new information is obtained.

Getting back to the original question, there is a lot of computer Monte Carlo simulation where we are trying to model random, non-deterministic behavior. We want to get a representative sample of results given the same known inputs. So if the simulation behaves as a function of those known inputs, that is a failure of the computer model.
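A toy Monte Carlo estimate of pi makes the tension concrete: with a fixed seed, the "random" run is perfectly repeatable, which is exactly the function-like behavior that can defeat a simulation meant to model genuine randomness. (A sketch using the C library PRNG; estimate_pi is an illustrative name.)

```c
#include <stdlib.h>

/* Estimate pi by sampling n points in the unit square and counting
   how many land inside the quarter circle. Seeding makes the whole
   "random" experiment a deterministic function of (seed, n). */
double estimate_pi(unsigned seed, int n) {
    srand(seed);
    int inside = 0;
    for (int i = 0; i < n; i++) {
        double x = rand() / (double)RAND_MAX;
        double y = rand() / (double)RAND_MAX;
        if (x * x + y * y <= 1.0)
            inside++;
    }
    return 4.0 * (double)inside / (double)n;
}
```

In practice, simulations rotate or randomize the seed precisely to escape this repeatability while keeping the run reproducible on demand.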

17. Aug 1, 2015

### Jarvis323

@FactChecker, Quantum theory uses a non-deterministic model, but does not prove the universe is non-deterministic. At least that is my understanding from what I have read. There are some good threads on this forum about this issue you might want to read.

Regarding what the user does or doesn't know, this is not relevant in pure maths and theoretical computer science. It may be relevant in applied statistics.

Last, you still are not recognizing the difference between pseudo-random and random, which is fundamental to the discussion we are trying to have.

Most pseudo-random number generators are seeded with some number, and given the seed, a sequence of numbers is generated. The entire sequence is a function of the seed value. Grabbing the next pseudo-random number is then just indexing into the predetermined sequence.
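The "indexing into a predetermined sequence" view, as a C sketch. The constants are the classic Numerical Recipes linear congruential generator, chosen here only for illustration:

```c
/* The n-th element of the LCG stream is a pure function of the seed:
   the same (seed, n) pair yields the same number, every time. */
unsigned lcg_nth(unsigned seed, int n) {
    unsigned s = seed;
    for (int i = 0; i < n; i++)
        s = s * 1664525u + 1013904223u;   /* Numerical Recipes constants */
    return s;
}
```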

Last edited: Aug 1, 2015
18. Aug 2, 2015

### aikismos

@tAllan,

Your question about the relationship between mathematical functions and the physical layer which gives rise to them is challenging because it rests on some philosophically challenging ideas. In the most general sense, yours is a question about how what we perceive as physical is interlinked with what we perceive as abstract; I specifically say "perceive" because what we discuss about the universe is subject to infinite regress, since as biological computers we are essentially trying to model how we ourselves function. (See https://en.wikipedia.org/wiki/Gödel,_Escher,_Bach). I state this as a preface because it has become somewhat chic to view the physical universe as information itself, or as some large simulation (see https://en.wikipedia.org/wiki/Programming_the_Universe), which rubs me the wrong way since it is not a falsifiable hypothesis, and I consider Lloyd's book an imaginative piece of science fiction until someone persuades me I'm a simulation. I also see such navel-gazing as mistaking introspection about our thoughts, which represent the physical universe, for the physical universe itself, and I personally draw a very sharp line between what is symbol and what is medium, because I believe abstraction to be an emergent property and information to be irreducible to the medium it is encoded in. This makes me a dualist, but not one who believes there is no interaction between the physical and the abstract. So let's get into my response now that you understand where I come from.

First, functions are a very specific type of mathematical construct, which is by definition a symbol to represent relationships between symbols in a very specific way. A function is a relation or rule between sets, the domain and codomain, where each informational input is mapped to exactly one output. Note well that this is a very specific set of words whose meaning deals in abstract entities (see https://en.wikipedia.org/wiki/The_Meaning_of_Meaning). In other words, 'function' is a symbol which we as thinkers (reference) use to name the definition above (referent). You say that my denial of your characterization of a function, as allowing physical inputs to create informational outputs, might preclude functions from existing, but the opposite is true. I oppose extending the definition of a function to include physical inputs precisely because it muddies the definitions of what is physical and what is informational, and it is such a definition that would lead to contradictions. Follow along, if you please, to see why.

Let us say we are aware of physical and abstract entities, and that each is subject to stasis and dynamics. In fact, they exist in parallel forms. Let's examine form one, the 'physiochemical processing system'. The car is an easy instantiation of this, whereby gasoline is added (physical and chemical input) to a gas tank (physical and chemical storage), from which a fuel pump moves it to the cylinders for combustion (chemical processing), which moves the pistons (physical processing), which ultimately results in the circular motion of the flywheel and tires (physical output) and in exhaust (chemical output). This entire system is understood very well by automotive engineers, who are expert in the physics, chemistry, and applied sciences (materials, e.g.) needed to prototype and perfect a reliable car. Now, form two, the 'information processing system', is analogous. As I sit here typing away at my keyboard, the physical keystrokes are converted to electrical signals (input) and are fed by a satellite microprocessor into the laptop's RAM (storage), where they are eventually packaged and sent through the IP protocol stack down to the physical layer (processing) and transmitted out my Wi-Fi card to the switch/firewall/router (output) via EM waves. Both of these systems, or in the abstract, cycles, have a basic conceptual symmetry to them: that which we 'change' (processing) and that which is in 'stasis' (storage) with regard to the boundaries of a system. But as a dualist, I have to call attention to a fundamental difference in their nature. I cannot put gasoline into a car and get information as a direct output (though I could measure and gather data on the process), and I cannot put keystrokes into my computer and get a laptop to move (though I could use a computer to control the car). More simply put, cars deal in physical inputs (gasoline, air, electrons for ignition), while the arithmetic, logic, and control units of my CPU deal in '1's and '0's as encoded by electrons.
The first system deals with a medium, but the second system deals with a medium and an encoded message!

So, if you now accept the dualist premise that the abstract is more than just physical, that it somehow emerges from the physical, and that it is not reducible in the sense that it has the same meaning (a positive voltage may represent a '1', but it is not a '1', because '1' is an abstract idea that our brains have evolved to use for communication), then you're ready to see my objection to blurring the definitions of the two by considering physical inputs into a computer actual 'symbols' for output. Not to sound contrarian to McLuhan's political statement, but the medium (the physical stuff) is not the message (the informational stuff). If it were, cryptologists wouldn't be needed, right? Capturing the encrypted disc (medium) would yield the contents (message), because they're really the same thing! Fortunately for privacy's sake, this is not true. In fact, the practicalities of encoding information onto a medium have been mathematized quite nicely by Claude Shannon and others (see https://en.wikipedia.org/wiki/Information_theory). Now, let me address your idea that physical inputs into an information system are possible, which I hold to be contradictory to the ontology that undergirds information theory.

Some may not appreciate the extent to which information theory and work on noise in channels has influenced computer science, but the modern digital computer is predicated on ideas that come from it, primarily in the form of encoding and error-correcting codes (ECC). Were we not able to move the bits (the term popularized by Shannon) around inside the computer reliably, we wouldn't be able to get reliable output from a digital computer at all. Bits are moved around inside a machine quickly and in large volumes, so even small amounts of error in the input, storage, processing, and output of bits would be disastrous for even a primitive OS. Remember that even a single wrong bit in an opcode in a short assembly of instructions could cause a system to crash. But what is it that ECC does, exactly? It fights informational entropy: the tendency of physical entropy and other irregularities to spontaneously change the medium and thereby alter the message against our wishes. Communication theorists call this 'noise'. Noise is nothing more than physical perturbation of a medium which affects the encoded information.

So, at last we come to the mechanics of my objection. Your conception of the physical characteristics of the medium which encodes a function is accurate as intuition goes. Yes, all information is embodied by a physical phenomenon (encoding). Yes, those physical phenomena are responsible for the input, storage, processing, and output of information (physical processing underlies information processing). Yes, even without physical input, information can be created or changed (noise). True random number generators use physical processes (such as thermodynamic changes) to generate information, so an entire industry already exists in using physical inputs to generate informational outputs, and yes, the uniqueness of the state of the system could be used to produce a single output. But it appears to me to be a semantic issue insofar as it is a violation of the definition of a function: "A function is a relation or rule between sets, the domain and codomain, where each informational input is mapped to exactly one output." The inputs have to be informational, that is to say, they have to be encoded before they enter the system, not created within the boundaries of the system. It may seem hair-splitting, but I think it goes to the question of intentionality of design and the definitions of noise and message.

If you start considering all physical perturbations of a system as informational inputs, then all sorts of semantic contradictions begin to arise. If physical entities are information, then information is the same as the physical and no difference exists between physical and information. All stuff is idea, and all idea is stuff, and that invalidates the analytical definitions of everything pertaining to the physical and abstract, and renders all sorts of reasoning meaningless, contradictory, and/or tautological. It muddles the logical edifice that all of these disciplines are built on because fundamentally it ignores the phenomenon of encoding.

To wit, examine the following claims: The quantity of mass of a rock isn't encoded by measurement, but exists inherent in the rock itself. The temperature of the granite isn't something measured, but just exists and is self-evident. The length of the rock is part of the rock, and not a number derived from comparison with a standard such as a meter stick. It's hard for people who believe in an external reality not to object to these ambiguous statements, because we constantly say in science that quantity, temperature, and length are objective measures; but if they are part of the object, if the temperature of das Ding an sich (the thing-in-itself) is inherent, how is it possible for two people to measure it and get different numbers? It is better to shed such certainty and realize that external reality comes to us through a perceptual filter, itself built of external reality and not accessible to introspection. It's better to model external reality by realizing that our measurement of 'temperature' is an internal process, that labeling it 'temperature' is an informational process, and that what we perceive as 'temperature' is a symbol for a referent which is the average kinetic energy of the molecules in a mass; 'temperature' is a practical abstraction liable to errors in measurement (hence the concepts of accuracy and precision). Everything you might think you observe, taste, feel, and touch is nothing more than psychological perception subject to distortion and bias, and that is precisely why science, empiricism, skepticism, and their ilk are recent and highly fruitful innovations. They recognize this. The whole scientific method is built around the dichotomy of physical (external) and abstract (internal), and the moment you blur them together in theory and practice, you set thinking back to magic.

So, I hope you see that I object to your characterization not because it is immediately far-fetched or contradictory. It actually seems like a pragmatic way to conceptualize a function, but the problem is, it puts a small leak in the hull of thinking that eventually sinks the ship. In principle when you start muddling the nature of that which is physical from what is abstract, you undermine the intersubjective process by which STEM arrives at reliable knowledge.

Last edited: Aug 2, 2015
19. Aug 2, 2015

### Jarvis323

But it's not necessary to even acknowledge any notion of the physical in the abstract model of the process, when it is specified as a function. The "physical input" is not physical in its mathematical form; it is simply a collection of values, real numbers.

Last edited: Aug 2, 2015
20. Aug 2, 2015

### aikismos

Precisely why physical inputs cannot be defined as inputs of a function. Physical input cannot become mathematical input unless it is either encoded by the system or is noise in the system. That's why, when you said "A function maps a specific point in the domain to one and only one point in the co-domain. This is something that random number generators seem to violate," you were correct. A true hardware random number generator (HRNG) does not need to embody a mathematical function; it can be designed as a routine which outputs information with no input of "information" per se. It creates information from physical entropy (which is NOT information). That's what my response asserts: what you keep calling "information" is NOT information until it is encoded into an information system. The equations of thermodynamics are NOT what they represent. They are a set of rules (and hence information) only in our minds. Particles do NOT have information; rather, information exists in our minds to describe particles. To think otherwise is to confuse thinking with external reality (a confusion which is quite popular these days).

21. Aug 2, 2015

### Fooality

The problem with those discussions about whether the universe is or is not deterministic is that they can't be falsified, and thus they are useless in discussing science. To see it, imagine I had a black box which was by definition non-deterministic: its outputs have no correlation with its inputs in any way, shape, or form. You could always argue that there is some set of laws which describes the outputs from the inputs, we just don't know what they are, and the argument would be plausible were it not for the a priori definition of non-determinism. Similarly, assume we are in a by-definition deterministic universe. To prove that it actually is deterministic requires omniscience: one must show that in every case, the same precedent state always leads to the same outcome. To totally rule out all non-deterministic events, we must be omniscient.

So what we do or do not know does matter in terms of practical philosophy. Newtonian physics, for instance, is still a really wonderful and powerful system if you just explicitly acknowledge what it doesn't cover, what it leaves us not knowing. But in terms of absolutes, it's wrong because it's not 100% right, like every other modern physics theory. And yet when I see the Mars Rover, I am struck by how much scientists have achieved with "wrong" theories like QM and relativity. Making space for what we don't know, for what theories don't cover, seems wise in science.

My preferred definition of randomness is the algorithmic randomness of Chaitin, explored also by Kolmogorov:
https://www.cs.auckland.ac.nz/~chaitin/sciamer.html

"Although randomness can be precisely defined and can even be measured, a given number cannot be proved to be random. This enigma establishes a limit to what is possible in mathematics."

That quote from the top makes explicit the relationship between what we don't know (including pseudorandom or chaotic sequences whose generators we don't understand) and true, absolute randomness: in many cases, we can't tell them apart.

22. Aug 5, 2015

### eltodesukane

A computation takes an input and produces an output. So this is a function.

23. Aug 5, 2015

### William White

English is as precise as you want it to be.

If you think English is ambiguous, that is because you are not using it properly.

I accept there is (sometimes) a need for technical terms if it makes a statement more concise and/or easier to understand.

However, some people use jargon for the opposite effect: to make a statement less concise and more ambiguous, or just plain nonsense.

Somebody responded to a post of mine with

"....any non-organically compromised Homo sapien sapien..."

It is absolutely meaningless. This sort of nonsense needs to stop if scientists want the public, and other scientists to understand what they are working on. If you are trying to convince somebody that your opinion or point of view is the correct one - or even worth contemplating - is it not better to ensure clarity of thought?

This is not about the grammar police, or the spelling nazis, it is about clarity of thought and clarity of communication. The best communicators (in any field) use plain language and straight talking.

Why can the OED define computation in five words?

24. Aug 5, 2015

### newjerseyrunner

All language is relative to the parties using it.

Consider a set of numbers that gets changed from 0328761954 to 0123456789. What happened in one sentence?

Computer Scientist: The set went from chaotic to ordered.
Physicist: The set went from ordered to chaotic.

So without a paragraph defining what "chaotic" and "ordered" mean, it's not possible to convey what happened to everyone using the same sentence. The programmer considers something "ordered" only if it's in some defined, interesting order. The physicist defines the starting state as ordered, regardless of whether that order is interesting. When you have words with multiple definitions, a language becomes ambiguous. That's why the tax code fills a library while the Standard Model fits on an index card. Before about 1600, a lot of math was written in natural language; it was codified for a reason.

25. Aug 5, 2015

### William White

But you have given an example that does not need jargon, though?

You could explain the meaning of those two sequences (the meaning you want to convey) using plain language, very easily.

If you think a sentence you write is ambiguous that is your fault and not the language.

Clear communication is a skill, and one that a lot of people in the "STEM community" lack.