## transformer output independent of core permeability

I am having trouble pinning down why the relative permeability of a transformer core does not directly affect the output voltage. In fact the voltage is determined by the turns ratio and is independent of the µ of the core material.

Given this, where is the flaw in the following reasoning?
The field strength H is proportional to the current in the primary coil: H = k I (to be exact, Ampère's law: ∮ H·dl = I_free + dΦ_D/dt).
The magnetic flux density in the core is B = µH.
The voltage in a single turn of the secondary is proportional to the rate of change of B: V = A dB/dt = A µ dH/dt = A µ k dI/dt.

By this (apparently specious) reasoning, the output voltage depends on µ.
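The chain above can be put in a minimal numerical sketch (all values and the constant k are made up for illustration, not taken from the thread): if the primary *current* were the quantity being forced, the per-turn secondary voltage would indeed scale with µ.

```python
import math

# Hypothetical illustrative values (assumptions, not from the thread).
MU0 = 4e-7 * math.pi   # vacuum permeability (H/m)
A = 1e-4               # core cross-section (m^2)
k = 1000.0             # proportionality in H = k*I (e.g. turns per metre)
dI_dt = 100.0          # forced rate of change of primary current (A/s)

for mu_r in (1.0, 1000.0, 10000.0):
    # Following the thread's chain: V = A * mu * k * dI/dt
    v_turn = A * mu_r * MU0 * k * dI_dt
    print(f"mu_r = {mu_r:7.0f} -> V per secondary turn = {v_turn:.4e} V")
```

The printed voltage grows linearly with µ_r, which is exactly the (specious) conclusion: the flaw, as the reply below the news snippet explains, is in assuming the current is what is held fixed.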

It's because, in the normal way one uses a transformer, H decreases as µ increases, so as to keep B constant. The point is that the primary winding sees B and the flux just as the secondary does. In a good transformer the primary voltage sets the flux with very little loss, and the secondary sees the same flux, except that each winding multiplies it by its own number of turns. Now if you have a very bad transformer, one that is inefficient at producing B and loses most of the primary voltage in the resistance of its copper windings, then H would be more nearly constant than B, and µ would increase B and hence the secondary voltage.
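This can be checked with a quick sketch of the voltage-driven case (illustrative numbers, not from the thread): Faraday's law fixes the peak B from the applied RMS voltage, V = 4.44 f N1 B_peak A, so µ only changes the magnetizing current H needs, never the secondary voltage.

```python
import math

# Made-up example transformer (assumptions for illustration).
MU0 = 4e-7 * math.pi
V1, f = 230.0, 50.0    # applied primary RMS voltage and frequency
N1, N2 = 500, 50       # primary and secondary turns
A = 2e-3               # core cross-section (m^2)
l = 0.5                # mean magnetic path length (m)

for mu_r in (1000.0, 10000.0):
    # Faraday: V1 = 4.44 * f * N1 * B_peak * A -> B is set by V1, not by mu.
    B_peak = V1 / (4.44 * f * N1 * A)
    # Ampere: H = N1 * I_m / l -> the magnetizing current shrinks as mu grows.
    I_m = (B_peak / (mu_r * MU0)) * l / N1
    V2 = V1 * N2 / N1  # turns ratio alone; mu appears nowhere
    print(f"mu_r={mu_r:6.0f}: B_peak={B_peak:.3f} T, "
          f"I_mag={I_m:.4f} A, V2={V2:.1f} V")
```

Raising µ_r by a factor of ten cuts the magnetizing current by a factor of ten while B_peak and V2 stay put, which is the "H decreases as µ increases, to keep B constant" statement in numbers.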
Thanks, that rings true. It opens the question of why I am setting B rather than H. I've seen Maxwell-equation formulations with D, H instead of E, B, but I suppose what you are saying is that the E, B form is 'basic' and the D, H form is 'dependent' thereon. I.e., when I run current through a wire I determine B, and if I bring a piece of iron or the like near, the H depends on the existing B and the µ of the material.