The electric utility has a digital wattmeter that appears to measure power usage on the high-voltage side of our distribution transformer (11 kV / 420 V, 3-phase, 250 kVA) via a current transformer (CT). How exactly does this work?

Specifically, what confuses me is that transformers generally obey a V × I = constant relationship (barring losses). So is this CT stepping down the current, or is it stepping down the voltage? Wikipedia says this:

> "When current in a circuit is too high to apply directly to measuring instruments, a current transformer produces a reduced current accurately proportional to the current in the circuit, which can be conveniently connected to measuring and recording instruments. A current transformer isolates the measuring instruments from what may be very high voltage in the monitored circuit."

This sounds to me like they are saying it reduces the current as well as the voltage. How can a transformer do both at once? What gives?
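
To put numbers on my confusion, here is the arithmetic I'm working from, treating the transformer as ideal. (The 15:5 CT ratio below is just a guess for illustration; I don't know the actual ratio installed.)

```latex
% Full-load line current on the 11 kV side of the 250 kVA,
% three-phase transformer (ideal-transformer assumption):
\[
  I_\text{primary} = \frac{S}{\sqrt{3}\,V_{LL}}
                   = \frac{250\,000\ \text{VA}}{\sqrt{3} \times 11\,000\ \text{V}}
                   \approx 13.1\ \text{A}
\]
% With a hypothetical 15:5 CT ratio (assumed for illustration only),
% the current delivered to the meter would be:
\[
  I_\text{meter} = 13.1\ \text{A} \times \frac{5}{15}
                 \approx 4.4\ \text{A}
\]
```

If the CT is reducing 13.1 A down to roughly 4.4 A for the meter, then by the V × I = constant argument the voltage on the meter side should go *up*, not down. That's the part I can't reconcile with the Wikipedia claim about isolating the instruments from high voltage.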