i think you've almost answered your own question. check out both the current definitions and the historical definitions at
http://physics.nist.gov/cuu/Units/background.html
first (after the meter, kg, and second) came the Ampere. it was defined to be the current that, when passed through two infinitely long, very thin parallel conductors in vacuum spaced exactly 1 meter apart, produces a magnetic force between those conductors of exactly 2 \times 10^{-7} Newtons per meter of length. that is what set \mu_0 = 4 \pi \times 10^{-7} N/A^2; if \mu_0 were anything different, that force per unit length would come out different from the defined value. then, of course, the Coulomb comes out to be an Ampere-second. there is nothing magical about these choices of units; they're quite anthropocentric and might not be used in 200 years.
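to see where that number comes from, here's the textbook force law for two long parallel currents (not spelled out above, but it's the standard Ampère force-per-unit-length formula), evaluated for 1 Ampere in each wire spaced 1 meter apart:

\frac{F}{L} = \frac{\mu_0 I_1 I_2}{2 \pi d} = \frac{(4 \pi \times 10^{-7}\ \mathrm{N/A^2})(1\ \mathrm{A})(1\ \mathrm{A})}{2 \pi \,(1\ \mathrm{m})} = 2 \times 10^{-7}\ \mathrm{N/m}

the 2 \pi in the denominator is exactly what the 4 \pi in \mu_0 was chosen to cancel (up to the factor of 2 in the defined force).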
until 1983, the meter was defined to be the distance between the centers of two little scratch marks on a platinum-iridium bar in Paris (and was originally defined so that the distance from the North Pole to the equator would be 10,000,000 meters), and the speed of light was measured to be 299792458 meters/second with some experimental error. at that time, \epsilon_0 = \frac{1}{c^2 \mu_0} also had experimental error. but in 1983 they changed the definition of the meter to be the distance that light in a vacuum travels in 1/299792458 seconds. that, plus the fact that \mu_0 was defined, had the effect of defining \epsilon_0. someday reasonably soon, they may redefine the kilogram to effectively give Planck's constant \hbar a defined value.
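just to put a number on it: plugging the defined c and \mu_0 into that relation gives the familiar value (rounded here to four figures)

\epsilon_0 = \frac{1}{c^2 \mu_0} = \frac{1}{(299792458\ \mathrm{m/s})^2 \cdot 4 \pi \times 10^{-7}\ \mathrm{N/A^2}} \approx 8.854 \times 10^{-12}\ \mathrm{F/m}

so after 1983 there was nothing left to measure in \epsilon_0; it's fixed by definition.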