# Computer power

## Main Question or Discussion Point

I often hear about how solving some problem would require a computer "the size of the known universe", or something like that. Is there a "unit" of computing power?

How many units of this power would, say, a 1 cm^3 cube of hot gas have? In other words, assuming you wanted to simulate the actual atoms bouncing around, how many units would it take to simulate this perfectly?

I believe information theory deals with these theoretical limitations.

http://en.wikipedia.org/wiki/Information_entropy

From that article you can see there is a relationship between information processing and the physical universe. Especially here:

"In fact, in the view of Jaynes (1957), thermodynamics should be seen as an application of Shannon's information theory: the thermodynamic entropy is interpreted as being an estimate of the amount of further Shannon information needed to define the detailed microscopic state of the system, that remains uncommunicated by a description solely in terms of the macroscopic variables of classical thermodynamics. For example, adding heat to a system increases its thermodynamic entropy because it increases the number of possible microscopic states that it could be in, thus making any complete state description longer. (See article: maximum entropy thermodynamics). Maxwell's demon can (hypothetically) reduce the thermodynamic entropy of a system by using information about the states of individual molecules; but, as Landauer (from 1961) and co-workers have shown, to function the demon himself must increase thermodynamic entropy in the process, by at least the amount of Shannon information he proposes to first acquire and store; and so the total entropy does not decrease (which resolves the paradox)."
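The Landauer bound mentioned at the end of that quote can be made concrete with a quick back-of-envelope calculation. This is just an illustrative sketch of the standard formula E = k_B T ln 2, not anything specific to the demon scenario:

```python
import math

# Landauer (1961): erasing one bit of information dissipates at least
# k_B * T * ln(2) of energy as heat.
k_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_limit(temperature_kelvin):
    """Minimum energy (in joules) to erase one bit at the given temperature."""
    return k_B * temperature_kelvin * math.log(2)

# At room temperature (~300 K):
print(landauer_limit(300.0))  # ~2.87e-21 J per bit
```

Tiny per bit, but it is exactly this cost that keeps the demon from decreasing total entropy.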

I read a popular physics book, "Decoding the Universe", in which the author argues that information and thermodynamic entropy are two forms of the same physical quantity, and that thermodynamic entropy is actually a special case of information entropy. It talks about the Turing machine and how it relates to thermodynamics. There is a definite limit on how the physical universe can represent and manipulate information, and it takes energy to do this.
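To get a rough feel for the scale in the original 1 cm^3 question, you can convert a thermodynamic entropy into Shannon bits via S / (k_B ln 2). The sketch below assumes nitrogen at STP (standard molar entropy ~191.6 J/(mol·K)) as a stand-in for "hot gas", so treat the result as an order-of-magnitude estimate only:

```python
import math

k_B = 1.380649e-23        # Boltzmann constant, J/K
S_MOLAR_N2 = 191.6        # standard molar entropy of N2 gas, J/(mol*K) -- assumed stand-in
MOLAR_VOLUME_CM3 = 22414  # molar volume of an ideal gas at STP, cm^3/mol

def entropy_in_bits(volume_cm3):
    """Shannon-information equivalent (in bits) of the thermodynamic
    entropy of a gas sample: bits = S / (k_B * ln 2)."""
    moles = volume_cm3 / MOLAR_VOLUME_CM3
    entropy = moles * S_MOLAR_N2      # entropy in J/K
    return entropy / (k_B * math.log(2))

print(f"{entropy_in_bits(1.0):.2e}")  # roughly 9e20 bits for 1 cm^3
```

So even a thimble of gas corresponds to on the order of 10^21 bits of microstate information, which is why "simulate it perfectly" talk quickly leads to universe-sized computers.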

I highly recommend reading this book if this topic interests you, although it won't give you detailed theory or equations to solve your particular question.