I am trying to derive a scaling factor for an analog voltmeter so I can measure the secondary voltage of an electronic halogen transformer (EHT).

http://www.ledbenchmark.com/faq/Transformers-Output-and-Compatibility.html

The output voltage of these devices is a "high-frequency" square wave (30-100 kHz) with a 100 Hz (2x line frequency) envelope (see the link above). Digital multimeters often have trouble measuring this unless you opt for a high-bandwidth, true-RMS model with a considerable price tag, so why not try an analog meter?

The specified RMS output voltage of the DUT (an Osram HTM70) is 11.5 V. The peak of the envelope lies at 19 V (checked with an oscilloscope), which works out to an RMS-to-peak ratio of 0.605, different from the 0.707 of a pure sine wave. Maybe my reasoning is flawed, but I believe this can be explained by the gaps between the envelope lobes: the oscillator inside the transformer needs a minimum voltage to run, so it drops out near the zero crossings of the mains.

I tried three different analog AC volt/multimeters, including two wide-band audio VOMs (one with a bandwidth of 1 MHz), and all three show a 10 V reading.

Analog meters measure the rectified average value and apply a scaling factor of pi/(2*sqrt(2)) ≈ 1.11 to adjust the reading to the RMS value of a pure sine wave. When the EHT output passes through the rectifier inside the analog multimeter, the average value should be well defined, since the positive and negative halves of the waveform are symmetrical, at least to my understanding. I therefore expected the reading to come out at the rated RMS value, which is apparently not the case. Where is the flaw in my logic?
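To make the gap hypothesis concrete, here is a minimal numeric sketch. It models the EHT output as a square carrier (so |v(t)| equals the envelope) with a 100 Hz |sin| envelope, plus a dropout threshold below which the oscillator is assumed dead. The |sin| envelope shape and the 50% threshold are my assumptions for illustration, not measured properties of the HTM70.

```python
import math

def meter_vs_rms(v_peak=19.0, dropout_frac=0.5, n=200_000):
    """Model the EHT output as a square-wave carrier whose amplitude follows
    a 100 Hz |sin| envelope, with the oscillator dead whenever the envelope
    falls below dropout_frac * v_peak (hypothetical threshold).
    Returns (true_rms, average_responding_reading)."""
    sum_abs = sum_sq = 0.0
    for i in range(n):
        # |v(t)| of a +/-A square carrier equals its envelope, so we can
        # sample the envelope directly over one envelope period.
        env = v_peak * abs(math.sin(2 * math.pi * i / n))
        v = env if env >= dropout_frac * v_peak else 0.0
        sum_abs += v
        sum_sq += v * v
    true_rms = math.sqrt(sum_sq / n)
    # Average-responding meter: rectified average scaled by pi/(2*sqrt(2)),
    # the form factor that makes the scale read RMS for a pure sine wave.
    reading = (sum_abs / n) * math.pi / (2 * math.sqrt(2))
    return true_rms, reading

print(meter_vs_rms(dropout_frac=0.0))  # no gaps: reading matches true RMS
print(meter_vs_rms(dropout_frac=0.5))  # with gaps: meter under-reads
```

With no dropout, a sine-enveloped square wave happens to have the same form factor as a pure sine (rectified average 2A/pi, RMS A/sqrt(2)), so an average-responding meter would read correctly. With dead time, the low-amplitude portions removed contribute proportionally more to the average than to the mean square, so the scaled reading falls below the true RMS, which is the direction of the discrepancy I observe.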