Hi,

From the basics of three-phase power, I believed that if you had a delta source with line and phase voltages of 100 V, then converting it into an equivalent star source would leave the line voltage at 100 V, and the equivalent star phase voltage would be 100 V/sqrt(3). Could someone confirm that this is correct?

The reason I ask is that I am now learning about transformers and have been looking at a star-delta transformer, and I cannot quite get my head around the primary and secondary voltages. The book I have been reading gives an example of a transformer with a star-connected primary winding and a delta-connected secondary winding, with a turns ratio of 10:1. It says the delta line voltage is 240 V. It then explains that the star phase voltage is simply 10 * 240 V = 2400 V, and the star line voltage is sqrt(3) * 2400 V = 4157 V.

Surely the star line voltage should be 2400 V and the star phase voltage 2400 V/sqrt(3)? I have always believed that when converting voltages from star to delta and vice versa, the line voltage magnitudes stay the same, as explained above.

If someone could shed some light on this I would be very grateful. Thanks.
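Just to make the book's arithmetic concrete, here is a quick sketch in Python of the calculation as the book presents it (variable names are mine; the figures are the ones from the example above):

```python
import math

# Figures from the book's star-delta transformer example:
turns_ratio = 10        # primary turns : secondary turns, per phase
delta_line_v = 240.0    # delta secondary line voltage, in volts

# In a delta winding, the phase voltage equals the line voltage.
delta_phase_v = delta_line_v                # 240 V

# The book applies the turns ratio to the PHASE voltages of the
# two windings, not the line voltages.
star_phase_v = turns_ratio * delta_phase_v  # 2400 V

# In a star winding, line voltage is sqrt(3) times phase voltage.
star_line_v = math.sqrt(3) * star_phase_v   # ~4157 V

print(round(star_phase_v), "V phase,", round(star_line_v), "V line")
```

This reproduces the book's 2400 V and 4157 V figures, whereas my expectation was the other way round (line 2400 V, phase 2400/sqrt(3) V), which is exactly the discrepancy I am asking about.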