I am an electrician, and I recently got into a disagreement on an issue. We were troubleshooting a fan motor that was rated at 240 V but was only getting 208 V. The more experienced electricians I was with said the amperage was going up because there wasn't enough voltage. This makes no sense to me.

I get that watts = volts × amps, and to them that formula explains why the amperage goes up: if the power stays the same, less voltage means more current. But I remember from my apprenticeship class that with a dual-voltage motor you can wire the stator windings in either series or parallel to change the resistance and keep the output equal regardless of the voltage. So V = IR makes sense to me, and by that formula less voltage should mean less current.

Basically every electrician I talked to, and I talked to a lot of them, including a few masters and a contractor, told me I was wrong. So I guess my questions are: (1) When voltage goes up, does amperage go up? (2) Regarding NEC Table 430.250, why do the full-load amps seem to support their view?
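To put numbers on the two views I'm describing (the 1560 W load and 32 Ω resistance below are made-up values, picked only so the arithmetic comes out clean; they're not from the actual motor):

\[
\text{Constant power:}\quad I = \frac{P}{V} \;\Rightarrow\; \frac{1560\ \text{W}}{240\ \text{V}} = 6.5\ \text{A}, \qquad \frac{1560\ \text{W}}{208\ \text{V}} = 7.5\ \text{A}
\]

\[
\text{Fixed resistance:}\quad I = \frac{V}{R} \;\Rightarrow\; \frac{240\ \text{V}}{32\ \Omega} = 7.5\ \text{A}, \qquad \frac{208\ \text{V}}{32\ \Omega} = 6.5\ \text{A}
\]

Their watts = volts × amps argument is the first line (treating the motor as a roughly constant-power load), and my V = IR intuition is the second line (treating it as a fixed resistance), which is how the same voltage drop gives us opposite answers.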