
The original definitions of Ohm, ampere, volt and coulomb

  1. Apr 2, 2014 #1
    What are the original definitions of the ohm, ampere, volt and coulomb?

    And which units were defined in terms of the others?
  3. Apr 2, 2014 #2



    Staff: Mentor

  4. Apr 2, 2014 #3

    Why didn't Ampère define 1 ampere as the current that passes through two wires held 1 m apart when a force of 1 newton is exerted on them? Why choose 2 × 10⁻⁷ newtons per metre instead?
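For reference, the SI relation behind that factor is F/L = μ₀I₁I₂/(2πd). A quick numerical sketch (mine, not from the thread) shows that two 1 A currents held 1 m apart attract or repel with exactly 2 × 10⁻⁷ N per metre of wire:

```python
import math

# Force per unit length between two long parallel wires (SI):
#   F/L = mu0 * I1 * I2 / (2 * pi * d)
mu0 = 4 * math.pi * 1e-7  # vacuum permeability, N/A^2 (pre-2019 exact value)

def force_per_length(i1, i2, d):
    """Force per metre between parallel currents i1, i2 (A) at separation d (m)."""
    return mu0 * i1 * i2 / (2 * math.pi * d)

# Two 1 A currents, 1 m apart:
print(force_per_length(1.0, 1.0, 1.0))  # -> 2e-07 N/m
```

The 2π in the denominator cancels half of the 4π in μ₀, which is where the bare factor of 2 comes from.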
  5. Apr 2, 2014 #4



    Staff: Mentor

    M. Ampère did not define the unit. It is merely named after him, in his honor.

    The history of electromagnetic units is a long and complicated one. Several different sets of such units have been used in the past, and to some extent even today. My head aches whenever I try to sort them out. :cry:

  6. Apr 2, 2014 #5
    The original definitions

    I just wanted to sort out how these units originated.
  7. Apr 2, 2014 #6
    I think the volt was originally derived from something else, because it seems illogical that both the ohm and the volt were derived from the ampere and then fitted into the equation V = RI.
  8. Apr 3, 2014 #7


    Science Advisor
    Gold Member

  9. Apr 3, 2014 #8



    Staff: Mentor

    Also see the section on electromagnetic units in


    because the ampere and volt were (per UltrafastPED's links) originally defined in terms of CGS units.

    My head has started to ache again...
  10. Apr 3, 2014 #9


    Science Advisor

    At least the ampere will soon be redefined in terms of the elementary charge instead of the other way around. :approve:

  11. Apr 3, 2014 #10


    Science Advisor

    The history of units is sordid, and involves a political war between the physicists who derived the equations and the engineers who won out and defined the units. The latest (and, I hope, not the final) battle was in the 1950s, when the engineers packed the auditorium to institute the present set of units. At one time (long ago) the Conférence Générale des Poids et Mesures defined the volt, the ampere, and the ohm independently, so that V did not equal IR.

    The volt was originally defined as the natural terminal voltage of a convenient voltaic cell of that time, and this is still approximately the size of 'one volt'. The original ampere was defined as the current in each of two long parallel wires, held 1 cm apart, that gives a force per unit length of 2 dyne/cm. The 2 comes from the derivation of this force, and it is convenient to keep it there. Because this depended on no arbitrary parameter, it was (and still is) called the 'absolute ampere'. It was the basis for the emu system of units. It is still used today (though not in elementary textbooks), and is often called the 'abampere'.

    For some reason, a conference in the 1880s (the earliest source I can find) instituted the 'international ampere', which is equivalent to what is now just called 'the ampere'. It was taken to be one tenth of an abampere, so that we could buy 30 amp fuses and get more for our money than with a 3 abamp fuse. The volt can then be defined in terms of energy, and the ohm from R = V/I.
    Factors like 2 × 10⁻⁷ or 4π × 10⁻⁷ come from the mismatch of CGS and MKS units, and from an irrational attempt at 'rational units'. If CGS units are used consistently, then SI units are just emu units with added powers of ten, such as the factor coming from 1 abamp = 10 amps.
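The abampere and SI ampere definitions can be checked against each other numerically. This is my own sketch, not from the thread: two wires 1 cm apart each carrying 1 abampere (= 10 A) should feel the emu-defining force of 2 dyn/cm.

```python
import math

# Consistency check: compute the SI force per unit length for two wires
# carrying 1 abampere each at 1 cm separation, then convert to emu units.
mu0 = 4 * math.pi * 1e-7        # vacuum permeability, N/A^2

i = 10.0                        # 1 abampere expressed in SI amperes
d = 0.01                        # separation: 1 cm in metres

f_si = mu0 * i * i / (2 * math.pi * d)   # force per unit length, N/m
f_emu = f_si * 1e5 / 100.0               # N/m -> dyn/cm (1 N = 1e5 dyn, 1 m = 100 cm)

print(f_emu)  # ≈ 2.0 dyn/cm, matching the emu definition of the abampere
```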
  12. Apr 18, 2014 #11
    Coulomb's constant didn't exist when Coulomb formulated his law, but when the units used to measure force, charge and distance were changed to SI, they had to put a constant into the equation?
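As a numerical aside (my own sketch, not from the thread): in SI, Coulomb's constant k = 1/(4πε₀) is exactly the kind of conversion factor being discussed, and it follows from μ₀ and the speed of light; in Gaussian units the same constant is simply 1.

```python
import math

# Derive Coulomb's constant from mu0 and c, using eps0 = 1/(mu0 * c^2).
c = 299_792_458.0                 # speed of light, m/s (exact in SI)
mu0 = 4 * math.pi * 1e-7          # vacuum permeability, N/A^2 (pre-2019 exact)
eps0 = 1 / (mu0 * c**2)           # vacuum permittivity, F/m

k = 1 / (4 * math.pi * eps0)      # Coulomb's constant, N·m^2/C^2
print(k)  # ≈ 8.99e9
```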
    Last edited: Apr 18, 2014
  13. Apr 19, 2014 #12

    Meir Achuz

    Science Advisor
    Homework Helper
    Gold Member

    Yes, a meaningless constant.
  14. Apr 19, 2014 #13

    What about the ampere?
  15. Apr 20, 2014 #14


    Science Advisor
    Gold Member

    All of the constants are scaling factors ... if the definitions of the units are changed, the scaling factors change.

    They are about as meaningful as 12 inches = 1 foot.

    That is, they both measure distance, and an engineer has to get it right (or else things don't work), but mathematically they can be adjusted to whatever you like ... as long as your final answer is given in some appropriate units.

    So theoreticians like to set c = 1 so that the equations are simpler; this means that distances are now measured in "light-seconds". Very nice if you like simple equations, but not convenient otherwise.
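The c = 1 bookkeeping described above amounts to carrying a single conversion factor; a minimal sketch (function names are mine, not from the thread):

```python
# With c = 1, times and lengths share a unit; converting back to SI
# just reinserts the factor c.
C = 299_792_458.0  # speed of light, m/s

def seconds_to_metres(t_seconds):
    """A time interval t (s) corresponds to the length light covers in it."""
    return t_seconds * C

def metres_to_seconds(x_metres):
    """A length x (m) corresponds to the light-travel time x / c."""
    return x_metres / C

print(seconds_to_metres(1.0))  # -> 299792458.0 (one 'light-second' in metres)
```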

    Being a physicist-engineer, I prefer the SI units, but I know people who prefer every other possible combination. Yuck.
  16. Apr 20, 2014 #15

    But the SI made the equations more complicated, since it introduced constants into equations that never needed a constant.
  17. Apr 20, 2014 #16


    Science Advisor
    Gold Member

    Some equations have new constants, others simply changed.

    There is no single system which removes all of the constants - not that I have ever seen.

    Most modern texts stick with SI, at least until you get to graduate school and the "old professors" use the "old texts" that still use CGS. In engineering you will hopefully see nothing but SI.

    Of course there is still the much-despised Imperial system ... inches, feet, pounds, pints, etc. But it is now found mainly in the US, and most engineering uses SI.
  18. Apr 20, 2014 #17


    Science Advisor
    2016 Award

    Also, most modern professors use CGS, or even better the rationalized CGS (Heaviside-Lorentz) units, when it comes to the treatment of relativistic electromagnetism. Jackson rewrote his famous textbook in SI (that's why I still prefer my old copy of the 2nd edition), but when it comes to the fully relativistic treatment he switches back to the old Gaussian system.

    The reason is very simple: the SI is made for engineering purposes and tailored for everyday use with electrical and electronic equipment, where typical charges, currents and fields come out as handy numbers. Another very important point is, of course, that the SI is a worldwide valid standard system of units, defined as accurately as possible with the technical means we currently have at hand. Some of the base units, such as those of time and length, are nowadays defined in a "natural" way by fixing constants (here the speed of light defines length in terms of the time unit, with the second fixed by a very accurately measurable frequency corresponding to a hyperfine transition in caesium). Others are likely to be changed pretty soon.

    The most pressing issue is the unit of mass, which is still defined by some platinum-alloy cylinder kept in Paris. Comparing with the other national standards shows that despite best care in keeping this piece "clean" its mass drifts by quite some amount, and the kg is not well enough defined anymore even for the most mundane purposes. That's why many national labs of standards like NIST in the US or the PTB here in Germany work hard to get a very accurate new definition for the kg. Similar ideas apply to the Ampere.

    That's why the SI is the system of units preferred for any experimental physics, engineering and any other practical purposes having to do with measurements.

    However, the SI is not very suitable for some parts of theoretical physics and that's why theorists use units that are simply more convenient. E.g., in the SI you have two artificial constants, [itex]\epsilon_0[/itex] and [itex]\mu_0[/itex] which have no physical significance whatsoever but are just conversion constants from the SI units to more "natural units". The only fundamental constant in electromagnetism is the speed of light in vacuum. In reality it's of course a much more universal constant, because it's the limit speed of causally connectable events in (special and general) relativistic spacetime. Also the speed of light is thus, after all, just a conversion factor between units of time and space intervals. In relativistic physics it's thus very convenient to set the speed of light in the vacuum to [itex]c=1[/itex]. Then you don't have any arbitrary constants in the fundamental equations of electromagnetism, the Maxwell Equations. The only units left are those of energy (which is the same for mass and momentum) and length (which is the same for time).
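The claim that ε₀ and μ₀ are mere conversion constants tied to the speed of light can be verified numerically; this sketch is mine, using the CODATA value of ε₀:

```python
import math

# eps0 and mu0 are not independent: together they encode the speed of
# light via c = 1/sqrt(eps0 * mu0).
mu0 = 4 * math.pi * 1e-7          # vacuum permeability, N/A^2 (pre-2019 exact)
eps0 = 8.8541878128e-12           # vacuum permittivity, F/m (CODATA 2018)

c = 1 / math.sqrt(eps0 * mu0)
print(c)  # ≈ 2.998e8 m/s
```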

    In high-energy physics, it's convenient to use GeV (giga-electron-volts) and fm (fermi = femtometres) for these. There it's also convenient to set the reduced Planck constant [itex]\hbar=1[/itex]. Then in principle you need only one unit, e.g., GeV, and lengths and times are measured in inverse GeV. Usually one still uses fm and GeV for the different quantities (e.g., fm for the lifetime and spatial extension of the fireballs of matter created in heavy-ion collisions; GeV for the energies and momenta of particles).
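With ℏ = c = 1 there is a single conversion factor between GeV⁻¹ and fm, namely ℏc ≈ 0.19733 GeV·fm. A small sketch of that bookkeeping (function names are mine, not from the thread):

```python
# hbar*c sets the exchange rate between inverse energies and lengths
# in hbar = c = 1 units.
HBARC_GEV_FM = 0.1973269804  # hbar*c in GeV·fm (CODATA 2018)

def inverse_gev_to_fm(length_in_inverse_gev):
    """Convert a length quoted in GeV^-1 to femtometres."""
    return length_in_inverse_gev * HBARC_GEV_FM

def fm_to_inverse_gev(length_in_fm):
    """Convert a length in fm back to GeV^-1."""
    return length_in_fm / HBARC_GEV_FM

print(inverse_gev_to_fm(1.0))  # ≈ 0.197 fm
```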

    The important point, of course, is that the physics doesn't change at all by the use of different units. All equations must be independent of the units used, and you can transform each quantity defined in one system of units to any other consistent system of units without changing the fundamental laws of nature. The Maxwell equations just look a bit different in the SI (hiding their relativistic symmetry behind some arbitrary constants for the definition of units that were invented for different purposes than exposing the fundamental symmetries of nature) compared to the Gaussian or the Heaviside-Lorentz system of units, but their physical content is exactly the same. You can as well do relativistic electromagnetism in the SI. Some textbooks do this. Everything becomes a bit more complicated than necessary for the poor students who have to solve problems but the physics is the same ;-).