If you have ever seen a benchmark of an audio/video receiver, you will have noticed how the output power drops off as more channels are driven at the same time. For example, a benchmark of my receiver, the Yamaha RX-V3900, showed 189 watts into one channel but only 88 watts per channel with all seven channels driven. Is anyone familiar with the factors behind this power limiting?

One possibility is that the transformer in the power supply limits the current that can be drawn. One theory I read is that the voltage drop across the transformer's internal resistance grows with current draw, which lowers the output voltage. This drop pulls down the power supply rails, so clipping happens at a lower voltage than the nominal rail voltage.

Yet another possibility is that the power transformer has insufficient flux to transfer the needed current to the secondary, which would mean the transformer is saturating. I don't understand transformer saturation well, so forgive me if this is not a reasonable explanation for the limiting I am talking about.

Another possibility is that receivers have a limiting circuit. I noticed in the service manual for my receiver that the early stages of each amplifier are fed from a different circuit than the final power transistors. That circuit takes the voltage rails as input and outputs lines labeled +LB/+LB. Based on what little I know (and I know only basic electronics), this could be limiting the rail voltage supplied to the early amp stages.

I am interested in understanding the main reason for the power drop-off as more channels are driven at the same time. If anyone has any insights, I would appreciate them.
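To see whether the rail-sag theory is even plausible, here is a minimal back-of-the-envelope sketch in Python. It models the supply as an ideal voltage source behind a fixed source resistance, and each class-AB channel at clipping as drawing an average rail current of roughly V_peak / (pi * R_load) for a full-scale sine. The no-load rail voltage and source resistance below are assumptions I picked to land near the RX-V3900 numbers, not measured values:

```python
import math

# Illustrative, simplified rail-sag model.  V0 and R_SRC are guesses
# chosen to roughly reproduce the benchmark, not measurements.
V0 = 58.0      # assumed no-load rail voltage, volts
R_SRC = 2.0    # assumed effective supply source resistance, ohms
R_LOAD = 8.0   # speaker load per channel, ohms

def clip_power_per_channel(n_channels):
    """Per-channel power at the clipping point with n channels driven.

    Each channel at full-scale sine draws an average rail current of
    about V / (pi * R_LOAD), so the sagged rail voltage V satisfies
    V = V0 - R_SRC * n * V / (pi * R_LOAD), which solves to the
    closed form below.  Device voltage drops are ignored.
    """
    v_rail = V0 / (1 + n_channels * R_SRC / (math.pi * R_LOAD))
    return v_rail ** 2 / (2 * R_LOAD)  # sine-wave power into R_LOAD

for n in range(1, 8):
    print(f"{n} channel(s): {clip_power_per_channel(n):6.1f} W per channel")
```

With these made-up numbers the model gives roughly 180 W for one channel and roughly 87 W per channel for seven, i.e. an effective source resistance of only a couple of ohms is enough to explain a 189 W to 88 W drop, without needing saturation or an explicit limiter. That obviously doesn't prove rail sag is the mechanism in this receiver, only that it is quantitatively sufficient.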