You probably won't see a very large change in efficiency; in fact, for motors that are lightly loaded, the reduced voltage will actually _increase_ machine efficiency.
Basically a motor is like a transformer (discussed recently). In fact, you can think of a motor as a transformer with a secondary that moves. Motors have voltage-dependent core losses and load-dependent conduction losses.
As the applied voltage goes up, the magnetic field strength and the core losses go up. But the load-associated conduction losses go _down_. For any given load, there is an ideal voltage that gives the best efficiency. For low-torque mechanical loads, the ideal voltage is lower than for full-torque operation. If the plant has lots of oversized, lightly loaded motors, then this reduced voltage is probably a good thing.
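To make the trade-off concrete, here is a simplified sketch (not the author's motor-design software): model core loss as proportional to V² and conduction loss as proportional to (load/V)², both in per-unit terms, then scan voltages to find the minimum-loss point for a given load. The loss coefficients are made-up illustration values.

```python
def total_loss(v_pu, load_pu, k_core=1.0, k_cond=1.0):
    """Per-unit losses: core loss rises with V^2, conduction loss with (load/V)^2."""
    return k_core * v_pu**2 + k_cond * (load_pu / v_pu)**2

def best_voltage(load_pu):
    """Scan per-unit voltages and return the one that minimizes total loss."""
    voltages = [0.5 + 0.01 * i for i in range(101)]  # 0.50 .. 1.50 pu
    return min(voltages, key=lambda v: total_loss(v, load_pu))

print(best_voltage(1.0))   # full load: ideal voltage near 1.0 pu
print(best_voltage(0.25))  # light load: ideal voltage well below 1.0 pu
```

With this toy model the minimum falls at V = sqrt(load), which is why the lightly loaded motor "wants" a lower voltage; a real machine has more loss terms, but the direction of the effect is the same.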
But the difference in efficiency between the ideal voltage and, say, 15% off is not all that great. I just ran a sim on some motor design software that I have access to. On a 20hp 4-pole induction motor, the efficiency at 480V was 91%, and at 300V it was 85%. The voltage was off by 37%, but the efficiency only dropped by 6 points. The efficiency change between 424V and 480V is slight indeed.
But if you look at the _losses_, you might see something interesting. In the example above, the losses went from 9% to 15%. This might mean a 500W difference in input power, out of a total of perhaps 17kW input. That 500W is 4000 kWh per year, so even though it is a small percentage of the total power consumption, you can still be talking big bucks. Also, the small change in efficiency means 50% more heat being produced in the motor. The motor temperature will be higher, and the motor life will be considerably reduced. Since most of the input electricity is still going to the mechanical output, the overall efficiency doesn't change much, but the _heating_ and temperature rise of the motor will change by a large margin.
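The annual-energy arithmetic above can be checked in a couple of lines. The 8000 hours/year (roughly continuous duty) and the $/kWh rate are assumed example values, not from the original:

```python
# Rough annual cost of the extra losses at the off-ideal voltage.
extra_loss_w = 500            # extra input power from the sim above, in watts
hours_per_year = 8000         # assumed: near-continuous duty
rate_per_kwh = 0.10           # assumed electricity rate, $/kWh

extra_kwh = extra_loss_w / 1000 * hours_per_year   # watts -> kW, then kWh/yr
extra_cost = extra_kwh * rate_per_kwh

print(f"{extra_kwh:.0f} kWh/yr, about ${extra_cost:.0f}/yr per motor")
```

Multiply that by a plant full of motors and the "small percentage" really can be big bucks.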