OK, I'm no electrical engineer, but I am an automation engineer with a ChemE background who, by the nature of my job, gets stuck tinkering around in PLC cabinets. My company has spent probably well over 200 man-hours trying to secure proper documentation to run equipment whose nameplate states 480 V when we're actually sitting at 492 V. I have no idea why we're at 492; the electrician must have done something funny with the ground.

Anyway, I have some schematics for one of the power supplies, and they give an input voltage of "380-480V ±15%". Can someone tell me how this makes ANY statistical sense? How can you have an accepted deviation on top of a range? I've seen this before and wondered about it, but in this case the range is huge. Not to mention I don't know what to take 15% of. The max voltage? That's 72 V. A range of 308 to 552 is just plain stupid. Am I wrong? Any insight?
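For reference, here's the back-of-the-envelope arithmetic I've been doing, as a quick Python sketch. The two "readings" of the spec are just my guesses at what the ±15% could apply to, not anything from the manual:

```python
# Quick sanity check of the two ways I can read "380-480V +/-15%".
lo, hi = 380.0, 480.0   # nameplate input range, volts
tol = 0.15              # the +/-15% tolerance

# Reading 1: the tolerance applies to each end of the range separately.
r1 = (lo * (1 - tol), hi * (1 + tol))    # (323.0, 552.0)

# Reading 2: take 15% of the max and pad both ends with it
# (this is where my 308-to-552 number came from).
delta = hi * tol                          # 72 V
r2 = (lo - delta, hi + delta)             # (308.0, 552.0)

print(f"per-endpoint reading: {r1[0]:.0f} V to {r1[1]:.0f} V")
print(f"15%-of-max reading:   {r2[0]:.0f} V to {r2[1]:.0f} V")
print(f"our measured 492 V is inside either window: {r1[0] <= 492 <= r1[1]}")
```

Either way I slice it, the acceptable window comes out enormous, which is exactly what I'm struggling to make sense of.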