# A question on fundamental dimensions

1. Sep 2, 2008

### janakiraman

Well, I do understand the 7 fundamental dimensions in physics, but I cannot understand what makes only length, mass and time combine with one another to produce derived units like force or acceleration. Why can't we have a unit like K^2 or sqrt(candela)?

2. Sep 2, 2008

### atyy

You can have any units you want, including sqrt(candela). Whether you use sqrt(candela) or convert to "base SI units" is a matter of convenience. The base SI units themselves are chosen because experimentalists have good ways of measuring them! Theorists often like different units, and there are many different systems of units in use, not even agreeing on the same number of base units (compare SI and cgs units).

The main restriction on weird units is that you cannot have log(meter) or exp(second). The argument of a log or an exponential must always be unitless, e.g. you must have log(ratio of lengths) or exp(ratio of times). Why?

Suppose, we have s=vt meters, with v in meters per second and t in seconds.
If we actually measured t to be w milliseconds, then the equation becomes s=vw/1000 meters.

No matter how complex your equation, changing units just multiplies each term by some number. (This is only almost true; for example, Maxwell's equations actually differ in form between SI and Gaussian CGS units, because factors of c get absorbed into the definitions of the fields.)

But suppose your equation is x = log(y), with y a length measured in meters.
If you measure that same length to be z centimeters, then y = z/100, and your equation becomes x = log(z/100) = log(z) - log(100).

So changing units is no longer multiplying your whole equation by some number, and it gets pretty bad if your equation is more complex.
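The contrast above can be sketched in a few lines of Python (the numbers and variable names are my own, chosen just for illustration): a unit change rescales the linear formula s = vt uniformly, but shifts a logarithmic formula additively.

```python
import math

# Linear formula: s = v * t. Re-expressing t in milliseconds just
# rescales the expression by a constant factor.
v = 3.0               # meters per second
t_s = 2.0             # seconds
t_ms = t_s * 1000.0   # the same time in milliseconds

s = v * t_s                    # meters
s_from_ms = v * t_ms / 1000.0  # identical: the unit change is a pure rescaling
assert s == s_from_ms

# Logarithmic formula: x = log(y), with y a length. Re-expressing y in
# centimeters shifts the result additively instead of rescaling it.
y_m = 2.0            # meters
y_cm = y_m * 100.0   # the same length in centimeters

x_m = math.log(y_m)
x_cm = math.log(y_cm)
print(x_cm - x_m)    # log(100) ≈ 4.605, independent of y
```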

There is an interesting consequence of this in information theory where

Entropy = Integral( -p(x) log(p(x)) dx ), where p(x) is a probability density.

A probability density has units! This means that the entropy of a probability density is meaningless (or rather, it changes with your choice of units). The resolution is that when we want to make sense of the entropy, we always use it inside another formula called the mutual information, where there is a second probability density against which we can take a ratio. Alternatively, if we believe that entropy has fundamental meaning, then nature must be discrete, so that we can use probability distributions instead of probability densities.
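The unit dependence of the entropy of a density can be made concrete with a Gaussian, whose differential entropy has the standard closed form (1/2) log(2 pi e sigma^2) in nats; the function name below is my own:

```python
import math

def gaussian_diff_entropy(sigma):
    # Differential entropy of N(0, sigma^2) in nats: 0.5 * log(2*pi*e*sigma^2)
    return 0.5 * math.log(2 * math.pi * math.e * sigma**2)

# The same physical distribution, described in two different units:
h_m = gaussian_diff_entropy(0.5)    # width 0.5, measured in meters
h_cm = gaussian_diff_entropy(50.0)  # the same width, measured in centimeters

# The two "entropies" differ by log(100): the value depends on the unit.
print(h_cm - h_m)
```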

Last edited: Sep 2, 2008
3. Sep 2, 2008

### tiny-tim

Hi janakiraman!

We certainly have charge² (coulomb²) …

among the electric units …

the farad (unit of capacitance) is coulomb²/joule, with dimensions Q²T²/ML²,

and permeability (henrys/metre) has dimensions ML/Q².
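The farad's dimensions above can be checked mechanically: represent a dimension as a dict of exponents over M, L, T, Q and add exponents under multiplication. This is a toy sketch of my own, not anything from the thread:

```python
# Dimensions as exponent dicts over the base symbols M, L, T, Q.
def dim_mul(a, b):
    # Multiply two quantities: exponents add.
    return {k: a.get(k, 0) + b.get(k, 0) for k in set(a) | set(b)}

def dim_pow(a, n):
    # Raise a quantity to an integer power: exponents scale.
    return {k: v * n for k, v in a.items()}

COULOMB = {'Q': 1}
JOULE = {'M': 1, 'L': 2, 'T': -2}  # energy: M L^2 T^-2

# Farad = coulomb^2 / joule
farad = dim_mul(dim_pow(COULOMB, 2), dim_pow(JOULE, -1))
# Result: exponents {Q: 2, T: 2, M: -1, L: -2}, i.e. Q^2 T^2 / (M L^2),
# matching the dimensions quoted for capacitance.
```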

4. Sep 3, 2008

### janakiraman

Hey tiny-tim, thanks for the information :)

@ atyy

Well, thanks for the elaborate reply. I now understand that the base units are mainly chosen for the experimentalists' convenience, a very simple point which I didn't see before. And I think you were telling me that you cannot just multiply a number into an equation containing log or exp, because they deal with exponentials. Point well taken. But I really didn't understand the entropy example; it would help if it were more lucid.

5. Sep 3, 2008

### atyy

OK, let's see.

First, there are probability distributions which describe experiments whose outcomes are discrete. Each point on the distribution gives you the probability of a certain outcome. Probabilities do not have units, so a probability distribution is unitless, and there is no problem in taking its logarithm.

Second, there are probability densities, which describe experiments whose outcomes can take a continuous range of values. A point on the density is not the probability of an outcome. Instead, the probability that the outcome falls within a small range of values is the product of the probability density p(x) and the width of that range dx:

Probability=p(x)dx

If your experiment measures length x, then dx will have unit [L]. Since probability is unitless, that means p(x) has unit [1/L]. So there is a problem if our formula contains log(p(x)), where p(x) is a probability density.
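A tiny numerical sketch of this point (the numbers are my own): a uniform density on an interval of length L has p(x) = 1/L, so p carries unit 1/length, while p(x)·dx stays unitless and unchanged under a change of units.

```python
# Uniform density on an interval of length L meters: p(x) = 1/L.
L = 2.0          # meters
p = 1.0 / L      # density value, unit 1/meter
dx = 0.01        # a small range, in meters
prob = p * dx    # unitless probability of landing in [x, x + dx]

# Re-express everything in centimeters: the density value shrinks by 100,
# the range grows by 100, and the probability itself is unchanged.
L_cm = L * 100.0
p_cm = 1.0 / L_cm            # unit 1/centimeter
prob_cm = p_cm * (dx * 100.0)
assert abs(prob - prob_cm) < 1e-15
```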

The formula for entropy, which is used in statistical mechanics and information theory, contains log(p(x)). According to our reasoning, this formula should not make sense if p(x) is a probability density, which may have units. Yet people still use it. Why? Because it is only an intermediate step: in the final step a ratio of probability densities is taken, i.e. log(p(x)/q(x)), and the argument of the log becomes unitless.
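That the ratio log(p(x)/q(x)) cancels the units can be checked numerically with the relative entropy (KL divergence) between two Gaussians, which has a standard closed form; the example below, with numbers of my own choosing, rescales both densities by a factor of 100 (meters to centimeters) and gets the same value.

```python
import math

def kl_gaussians(mu1, s1, mu2, s2):
    # KL( N(mu1, s1^2) || N(mu2, s2^2) ) in nats (standard closed form).
    return math.log(s2 / s1) + (s1**2 + (mu1 - mu2)**2) / (2 * s2**2) - 0.5

# Two densities described in meters:
kl_m = kl_gaussians(0.0, 1.0, 0.5, 2.0)
# The same two densities described in centimeters (all lengths scaled by 100):
kl_cm = kl_gaussians(0.0, 100.0, 50.0, 200.0)

# The units cancel inside log(p/q), so the value is unchanged.
assert abs(kl_m - kl_cm) < 1e-12
```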

Or, if one insists that the entropy itself makes sense, and is not just an intermediate step, then one must believe that the outcomes of one's experiment are always discrete, not continuous.

The entropy, which is the intermediate step:
http://en.wikipedia.org/wiki/Information_entropy

The mutual information, which is the final step:
http://en.wikipedia.org/wiki/Mutual_information#Definition_of_mutual_information

These are good notes:
http://www.cscs.umich.edu/~crshalizi/prob-notes/
http://cscs.umich.edu/~crshalizi/notebooks/information-theory.html

These notes compare the entropy of probability distributions and densities:
http://ocw.mit.edu/NR/rdonlyres/Phy...73C86E-4C25-4B8D-BEF4-11FF59C54D63/0/lec6.pdf

A good resource:
http://www.math.uni-hamburg.de/home/gunesch/entropy.html

Last edited: Sep 3, 2008
6. Sep 3, 2008

### Count Iblis

There is no such thing as "fundamental units". What we have is our ignorance about the fundamental laws of physics, which is fortunately decreasing. The constants h-bar and c are conversion factors, artifacts of defining units before we understood their relations. The fact that we assign dimensions to these constants is an artifact of assigning different dimensions to space, time and mass.

In theoretical physics it is customary to put h-bar = c = 1. This amounts to measuring space, time and inverse mass in the same units. If you also put G = 1, then everything becomes dimensionless. Now, it is not clear that G = 1 is the right choice; I think some physicists prefer to set 8 pi G = 1. But if the fundamental theory of physics could tell us what the most natural choice of G should be, then everything would become dimensionless in an unambiguous way, and then the expression exp(meter) would be a well-defined dimensionless number.
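To illustrate the h-bar = c = 1 bookkeeping: in natural units a mass is an inverse length, and the conversion factor back to ordinary units is h-bar·c ≈ 197.327 MeV·fm (a standard value). A small sketch, with the function name my own:

```python
# Conversion factor: hbar * c ≈ 197.327 MeV·fm
HBARC_MEV_FM = 197.327

def mass_to_length_fm(mass_mev):
    # Reduced Compton wavelength: hbar / (m c) = (hbar c) / (m c^2),
    # i.e. the length that corresponds to a mass once hbar = c = 1.
    return HBARC_MEV_FM / mass_mev

electron_mass_mev = 0.511
print(mass_to_length_fm(electron_mass_mev))  # ≈ 386 fm
```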