rbj said:
The Ampere was never defined as a Coulomb per second.
Jeff Reid said:
Not only that web site, but also a college textbook, University Physics by Sears and Zemansky (at least in the old edition I still have). Remember, the point here was a layman's definition of an ampere, other than the quoted one that referenced the two long wires.
Jeff Reid said:
As previously posted, some college textbooks define the Coulomb first, since this allows charge to be defined. The Ampere is then defined later, as a rate of charge flow: one coulomb per second. It's a more logical introduction to the physics of electronics: define charge first, then current and voltage.
And why ignore the website but still include it in a quote?
i was just quoting you. i looked at the website, which is nice, but it is not the authority, whereas the NIST site (below) is the authority on these definitions. Layman's definitions are fine so long as we do not tell a lie in them.
Well, for the purposes of a layman, it is okay, i guess, to think of the unit charge as being defined first and then to define current as the time rate of movement of charge. But that leaves the question of how to define the unit charge, and the answer would be the same: a coulomb is the amount of charge such that the permeability of free space \mu_0 comes out to be exactly 4 \pi \times 10^{-7} Henries per meter.
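A quick numerical sketch (mine, not from the thread) of how \mu_0 connects to the two-long-wires definition mentioned earlier: the force per unit length between two parallel wires is F/L = \mu_0 I_1 I_2 / (2 \pi d), so with exactly 4 \pi \times 10^{-7} for \mu_0, two 1 A currents 1 m apart attract with exactly 2 \times 10^{-7} N/m, which is the classical wording of the ampere's definition.

```python
import math

# Permeability of free space: exact by definition in the (pre-2019) SI
mu0 = 4 * math.pi * 1e-7  # henries per meter

def force_per_meter(i1, i2, d):
    """Force per unit length (N/m) between two long parallel wires
    carrying currents i1 and i2 (A), separated by distance d (m)."""
    return mu0 * i1 * i2 / (2 * math.pi * d)

# Two wires, 1 A each, 1 m apart: the pi's cancel, leaving 2e-7 N/m
print(force_per_meter(1.0, 1.0, 1.0))  # approx 2e-7
```

Note how the 4\pi in \mu_0 cancels the 2\pi in the formula, which is exactly why the odd-looking factor of 4\pi was chosen.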
ZapperZ said:
"Could" and "is" are two entirely different things. I'm not teaching history here. What is THE definition of an Ampere NOW?
you're right, Z, but sometimes these definitions get changed as the leaders in the community of physical sciences decide what is more important to have defined and what is best left measured. but sometimes it's good to toss in a little history, and i would really recommend the OP and Jeff and anyone else interested look at the NIST site:
http://physics.nist.gov/cuu/
and particularly
http://physics.nist.gov/cuu/Units/current.html .
For example, between 1889 and 1960 the hard definition of the meter was the distance between the centers of two scratch marks on a platinum-iridium bar, and the second was 1/86400 of a "day". But with changes in 1960, 1967, and 1983, the definitions of the meter and second were updated in such a way as to define the speed of light in vacuum to be exactly 299792458 m/s. so now the distance between those two scratch marks is measured, not defined, and comes out to be very, very close to 1 meter. the effect on the length of the day has been more noticeable: given the hard definition of the second, every once in a while they have to insert a leap second (the minute immediately before midnight 01/01/06 UTC had 61 seconds in it instead of 60) because the rotation of the Earth is very gradually slowing down.
Some day (I hope soon) they will redefine the kilogram so that it is not the mass of that prototype in Paris but will be such a mass as to make Planck's constant \hbar an exact, defined number. This is essentially how the Ampere (and by extension, the Coulomb) have been defined: they are whatever they have to be so that the permeability of free space \mu_0 comes out to be exactly 4 \pi \times 10^{-7} in SI units. But someday they might very well redefine the Coulomb to be some exact, defined number of elementary charges, so that the elementary charge comes out to be exactly 1.60217653 \times 10^{-19} C. That is not the case now, so that number is measured, and there is some range of error in that measurement.
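To make the hypothetical redefinition above concrete, here's a quick back-of-the-envelope (my own, not from the thread): if the coulomb were fixed as an exact count of elementary charges, that count would just be the reciprocal of the measured value of e quoted above.

```python
# Measured elementary charge as quoted in the thread (CODATA-era value,
# with some uncertainty in the last digits):
e = 1.60217653e-19  # coulombs per elementary charge

# If the coulomb were redefined as an exact count of elementary charges,
# that count would be roughly the reciprocal:
charges_per_coulomb = 1.0 / e
print(f"{charges_per_coulomb:.6e}")  # about 6.24e18 elementary charges
```

So one ampere of current corresponds to roughly 6.24 \times 10^{18} elementary charges passing per second.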