We know that motors can tolerate +/- 10% voltage variation. That math is simple enough on a dual-voltage motor (230 / 460, 120 / 230).
What if the motor is rated 120 / 208-230? What is the actual permitted deviation at the higher voltage? Can you apply the tolerance to the whole range? For example, supplying the motor with 240 volts would still be less than 10% above 230, but it would be more than 10% above 208.
When an appliance -such as a motor- is given an operating range on the nameplate, the entire range is used.
For example, a motor rated 208-230 (on the nameplate) would have been tested from -10% of 208 up to +10% of 230.
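To make the arithmetic concrete, here is a minimal sketch of that rule, assuming the +/-10% tolerance is applied to the two ends of the nameplate range (the 208-230 figures come from the example above; the variable names are just illustrative):

```python
# Permitted supply band for a motor nameplated 208-230 V, applying the
# +/-10% tolerance to the ends of the whole rated range.
LOW_NAMEPLATE = 208.0
HIGH_NAMEPLATE = 230.0
TOLERANCE = 0.10

v_min = LOW_NAMEPLATE * (1 - TOLERANCE)   # lowest permitted supply voltage
v_max = HIGH_NAMEPLATE * (1 + TOLERANCE)  # highest permitted supply voltage
print(f"Permitted supply range: {v_min:.1f} V to {v_max:.1f} V")
# -> Permitted supply range: 187.2 V to 253.0 V
```

On this reading, a 240 V supply sits inside the band (240 < 253), even though it is more than 10% above 208 alone.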
An important consideration, however, is that the testing criteria only require that nothing bad immediately happen. Long-term over-heating, inefficiency, and a dramatic loss of torque are all allowed, and not even looked for!
Dumb question: Does anything bad happen when you increase the voltage to a synchronous motor? For example, if you had a 230/240V or whatever it is motor, and ran it at, say 300V, does that cause any problems?
Motors are, obviously, different from light bulbs. Increase the voltage feeding a light bulb, and the current increases, the power increases as the square of the voltage increase (not counting the temperature dependence of the filament, which is huge), all of which results in a large increase in brightness and a very short lifetime.
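The square-law scaling for the resistive (bulb) case can be sketched quickly; the 144-ohm value below is just a hypothetical 100 W / 120 V filament, and the temperature dependence mentioned above is deliberately ignored:

```python
# For a fixed resistance R, P = V^2 / R, so a 10% voltage increase
# gives roughly a 21% power increase (1.1^2 = 1.21).
R = 144.0  # ohms; hypothetical 100 W bulb at 120 V, filament heating ignored

for v in (120.0, 132.0):  # nominal and +10%
    p = v ** 2 / R
    print(f"{v:.0f} V -> {p:.1f} W")
# -> 120 V -> 100.0 W
# -> 132 V -> 121.0 W
```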
A synchronous motor, on the other hand, tends to draw whatever power is needed to rotate the load. If you increase the voltage to it, it's just going to cause a slight change in the phase angle between the rotating vector and the nominal "no load" position, so that after you figure in the change in power factor, it's drawing pretty much the same power as before. At least that's how it seems to me.
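That phase-angle picture can be sketched with the simplified cylindrical-rotor model, where per-phase power is P = (V·E/X)·sin(δ). All the numbers below (internal EMF, reactance, load power) are hypothetical, chosen only to show that for a fixed load, raising the terminal voltage mostly just shrinks the load angle:

```python
import math

# Simplified synchronous machine: per-phase P = V * E / X * sin(delta).
# For a fixed mechanical load, the machine settles at whatever load
# angle delta satisfies the power balance.
E = 230.0        # internal (back) EMF, volts -- hypothetical
X = 5.0          # synchronous reactance, ohms -- hypothetical
P_LOAD = 5000.0  # watts per phase, fixed by the mechanical load

for v in (230.0, 300.0):  # nominal terminal voltage, then overvoltage
    delta = math.asin(P_LOAD * X / (v * E))
    print(f"V = {v:.0f} V -> load angle = {math.degrees(delta):.1f} deg")
```

Raising V from 230 to 300 in this model reduces δ from roughly 28° to roughly 21° while delivering the same power, which matches the intuition above. (What this toy model does not capture is the increase in magnetizing current and core saturation at real overvoltage, which is where the heating trouble comes from.)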
Beyond synchronous motors, I don't have a mental picture for other types of motors, so I don't even have a theory on how voltage affects them.
Any thoughts on any of this?
[This message has been edited by SolarPowered (edited 05-04-2006).]