Dumb question: Does anything bad happen when you increase the voltage to a synchronous motor? For example, if you had a 230/240V or whatever it is motor, and ran it at, say 300V, does that cause any problems?

Motors are, obviously, different from light bulbs. Increase the voltage feeding a light bulb and the current increases, so the power increases as the square of the voltage (not counting the temperature dependence of the filament, which is huge), resulting in a large increase in brightness and a very short lifetime.
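A quick sketch of that square-law relationship, treating the bulb as an ideal fixed resistor (all values here are hypothetical, and this deliberately ignores the filament temperature effect mentioned above):

```python
# Ideal-resistor model: P = V^2 / R, so power scales with the square of voltage.
# Hypothetical 60 W, 240 V bulb; ignores the filament's temperature dependence.
R = 240.0**2 / 60.0   # resistance implied by the 60 W / 240 V rating, in ohms

def power(v, r=R):
    """Power dissipated by a fixed resistance r at voltage v."""
    return v**2 / r

p_nominal = power(240.0)   # 60.0 W at rated voltage
p_over = power(300.0)      # (300/240)^2 = 1.5625 times rated power
print(p_nominal, p_over, p_over / p_nominal)  # 60.0 93.75 1.5625
```

So going from 240 V to 300 V pushes this idealized bulb to over 1.5 times its rated power, which is the "short lifetime" part.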

A synchronous motor, on the other hand, tends to draw whatever power is needed to rotate the load. If you increase the voltage to it, that just causes a slight change in the phase angle between the rotating vector and the nominal "no load" position, so that after you figure in the change in power factor, it's drawing pretty much the same power as before. At least that's how it seems to me.
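You can put a number on that phase-angle shift with the usual per-phase power-angle relation, P = (V·E/X)·sin(δ). If the load fixes P and you raise V, then sin(δ) (and hence δ) has to shrink. The values below are made up purely to show the trend:

```python
import math

# Power-angle relation for a synchronous machine (per phase):
#   P = V * E / X * sin(delta)
# All numbers are hypothetical illustration values, not from any real motor.
E = 240.0        # back-EMF magnitude in volts (excitation assumed fixed)
X = 5.0          # synchronous reactance in ohms
P_load = 2000.0  # power demanded by the load in watts (set by the load)

def load_angle(v):
    """Load angle delta (radians) needed to deliver P_load at supply voltage v."""
    return math.asin(P_load * X / (v * E))

d_230 = load_angle(230.0)
d_300 = load_angle(300.0)
# Raising the voltage shrinks the load angle while delivered power stays P_load.
print(math.degrees(d_230), math.degrees(d_300))
```

With these numbers the load angle drops from roughly 10 degrees at 230 V to about 8 degrees at 300 V, while the delivered power stays pinned at whatever the load demands, which matches the intuition above.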

Beyond synchronous motors, I don't have a mental picture for other types of motors, so I don't even have a theory on how voltage affects them.

Any thoughts on any of this?


[This message has been edited by SolarPowered (edited 05-04-2006).]