# Coulomb and ampere

1. May 21, 2004

### darkar

Current is the rate of change of current passing through a point each second. 1 ampere is defined as the current in each of two infinitely long straight parallel wires 1 m apart in a vacuum which produces a force of 2E-7 N/m on each wire.

My question is: why do we choose current as the base unit rather than charge?
And why is the force 2E-7 N/m rather than 1 N/m?

Thx

2. May 21, 2004

### Gokul43201

Staff Emeritus
You mean "rate of change of charge" !

That is a useful definition of the amp, because it relates the amp to mechanical quantities that we may be more familiar with. The force is not 1 N/m because the expression for the force involves a constant called the permeability of free space. This constant, unfortunately, is not equal to 1 (or, more precisely, to 2*Pi, which is what would make the force come out to 1 N/m for unit currents 1 m apart).
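To see where the 2E-7 comes from, here is a minimal numeric check of the standard force-per-length formula for two long parallel wires, F/L = mu_0 * I1 * I2 / (2 * pi * d); the function name and the choice of I = 1 A, d = 1 m are just for illustration:

```python
import math

# Pre-2019 exact value of the permeability of free space, in N/A^2.
MU_0 = 4 * math.pi * 1e-7

def force_per_length(i1, i2, d):
    """Force per metre between two long parallel wires carrying
    currents i1, i2 (A) at separation d (m)."""
    return MU_0 * i1 * i2 / (2 * math.pi * d)

# With 1 A in each wire, 1 m apart, the 4*pi and 2*pi cancel down
# to exactly 2e-7 N/m -- the value quoted in the definition.
print(force_per_length(1.0, 1.0, 1.0))  # 2e-07
```

The factor of 2 survives because mu_0 was fixed at 4*pi*1e-7, so mu_0/(2*pi) = 2e-7 exactly.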

Another way to define 1 amp is as the current you get when 1 coulomb of charge passes through a cross-section of the wire every second.

3. May 21, 2004

### darkar

Okay, but I have another question.
I am curious why physicists chose current as the base unit and charge as the derived unit. I need some sort of explanation.

Help is very appreciated.

4. May 22, 2004

### kuengb

Unit definitions are always based on phenomena that are the most "easy" to measure, i.e. that give the most accurate results. It's quite difficult to measure charge directly!

Those definitions also change with technical development. Until 1946, the ampere was defined using the electrolysis of silver nitrate (AgNO_3): 1 A is the current that deposits 1.11800 mg of silver per second at the cathode. This seems a strange way to define the unit of current, but it was the most precise technique for measuring current at that time.
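That old silver definition can be cross-checked against Faraday's law of electrolysis, m = M / (z * F) per coulomb; a quick sketch using the standard values M(Ag) = 107.868 g/mol, z = 1, and F = 96485 C/mol:

```python
# Faraday's law of electrolysis: mass deposited per coulomb is M / (z * F).
M_AG = 107.868   # molar mass of silver, g/mol
Z = 1            # Ag+ carries one elementary charge
F = 96485.0      # Faraday constant, C/mol

# mg of silver deposited per coulomb, i.e. per second at 1 A.
mg_per_second = M_AG / (Z * F) * 1000
print(mg_per_second)  # ~1.118, matching the defined 1.11800 mg/s
```

So the electrolytic definition and the modern one agree to within the precision of the constants, which is exactly why it served as a practical standard.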

5. May 22, 2004

### darkar

May I know what the difficulty is in counting charge?

Here, I have another question: why is the electron given such a small charge value, 1.60E-19 C?

6. May 22, 2004

### GRQC

The use of current as a fundamental unit reflects the implicit unification of the E and B fields -- i.e., view the definitions as being based on the source of the fields.

The value of the unit is still, however, based on having 1 C of charge (and hence 1 A of current) generate the interaction.

7. May 23, 2004

### kuengb

Hmm... I've never done a thing like this, but I suspect you'd need a high DC voltage source (to "produce" the charge you want to measure), which would have to be perfectly constant... probably insulation problems too... I don't know. In fact, charge measurement is often done with a galvanometer, which is essentially a measuring apparatus for small currents.

But the point is: what you need for technical applications is current measurement, not charge measurement. So, say you have a perfect 1 C charge on a capacitor, and you want to calibrate an ammeter with it. You know: this amount of charge flowing through per second, that's 1 ampere. But how do you produce a constant current from your perfect 1 C charge? Not possible.
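The point about the capacitor can be made concrete: discharging it through a resistor gives an exponentially decaying current I(t) = (Q/RC) * exp(-t/RC), never a constant one. A minimal sketch, with R = 1 ohm and C = 1 F chosen purely for illustration:

```python
import math

Q, R, C = 1.0, 1.0, 1.0   # 1 C on a capacitor; R and C are arbitrary here
tau = R * C               # discharge time constant

def current(t):
    """Discharge current at time t for an RC circuit starting with charge Q."""
    return (Q / tau) * math.exp(-t / tau)

# The current drops by orders of magnitude over a few time constants...
print(current(0.0), current(5 * tau))  # 1.0 vs ~0.0067 A

# ...yet numerically integrating it still recovers the full 1 C of charge.
dt = 1e-4
total = sum(current(i * dt) * dt for i in range(int(10 * tau / dt)))
print(total)  # ~1.0 C
```

So the stored charge fixes the *integral* of the current, but gives you no steady current to calibrate an ammeter against.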

As for why the electron's charge has such a small value, the answer is quite simple: because the unit coulomb is much older than the electron! I don't know who used "coulomb" first (I don't think it was Coulomb himself; that would have made him quite an arrogant guy). What I do know is that the idea of quantized charge came up around 1900 or a bit earlier. Millikan was the first to measure it.