Transformers have many different losses, each caused by different factors. In some cases, the things that you do to reduce one loss will increase others.

The greater the losses, the lower the efficiency.

Under full load, the biggest loss term will be conduction losses; the current moving through the coils causes voltage drop and heat generation. The lower the resistance of the conductors, the smaller this loss term.
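
A quick sketch of that square-law behavior; the current and resistance below are made-up example values, not from any real unit:

```python
# Sketch of conduction (I^2 * R) loss; example values only (assumed).
I_full = 40.0   # full-load winding current, amps (assumed)
R_wind = 0.05   # total winding resistance, ohms (assumed)

for fraction in (0.25, 0.5, 1.0):
    I = fraction * I_full
    p_cond = I**2 * R_wind   # watts dissipated as heat in the coils
    print(f"{fraction:.0%} load: {p_cond:5.1f} W conduction loss")
```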

But under no load at all there are still losses. Most of these get lumped together as 'core loss', but since I think in terms of motor design I tend to separate them out.

You have 'hysteresis losses'. This is the energy lost in continually changing the magnetic flux through the core. This loss depends upon the flux density (how much magnetic 'current' you have), the frequency (how quickly the flux is changing), and the mass of the core. These losses increase only slowly with core saturation, since the flux itself isn't increasing much once the core is saturated.
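
The usual empirical model for this is the Steinmetz equation; here's a minimal sketch, with the coefficient and exponent chosen as plausible placeholders for electrical steel rather than measured values:

```python
# Steinmetz-style hysteresis loss: P_h ~ k_h * f * B_max**n per unit mass.
# k_h and n are material constants; the values here are illustrative only.
k_h = 0.02    # hysteresis coefficient (assumed)
n = 1.8       # Steinmetz exponent, typically ~1.6-2.0 for steel
f = 60.0      # line frequency, Hz
mass = 25.0   # core mass, kg (assumed)

for B_max in (1.0, 1.4, 1.7):        # peak flux density, tesla
    p_h = k_h * f * B_max**n * mass  # watts
    print(f"B_max = {B_max} T -> hysteresis loss ~ {p_h:.0f} W")
```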

You have 'eddy current losses'. These are electric currents induced in the core itself, caused by transformer coupling between the coils and the core. You reduce this loss by increasing the electrical resistance of the core and by laminating the core.
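
For thin laminations the classical estimate makes both of those levers explicit: loss scales with the square of lamination thickness and inversely with core resistivity. A sketch with assumed numbers:

```python
import math

# Classical eddy-current loss per unit volume for thin laminations
# (sinusoidal flux): p_e = pi^2 * f^2 * B_max^2 * t^2 / (6 * rho).
f = 60.0      # Hz
B_max = 1.5   # peak flux density, tesla (assumed)
rho = 5e-7    # core resistivity, ohm*m (typical order for silicon steel)

for t in (0.002, 0.0005, 0.00035):   # lamination thickness, meters
    p_e = math.pi**2 * f**2 * B_max**2 * t**2 / (6 * rho)   # W/m^3
    print(f"t = {t*1000:4.2f} mm -> eddy loss ~ {p_e/1000:6.1f} kW/m^3")
```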

You have 'magnetization losses'. This is the energy lost in the primary current that flows to maintain the magnetic field in the core. This loss will increase drastically once the core starts saturating, since the magnetizing current goes through the roof.
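
A toy B-H model shows why: the Froelich form B = B_sat * H / (a + H) saturates at B_sat, so the field (and with it the magnetizing current) needed for a given flux density blows up near saturation. Both constants here are invented for illustration:

```python
# Toy saturation model (Froelich equation): B = B_sat * H / (a + H).
# Inverting gives H = a * B / (B_sat - B): the field H, and with it the
# magnetizing current, explodes as B approaches B_sat.
# Both constants are made up for illustration.
B_sat = 1.9   # saturation flux density, tesla (assumed)
a = 50.0      # shape constant, A/m (assumed)

for B in (1.0, 1.5, 1.7, 1.85):
    H = a * B / (B_sat - B)   # required magnetizing field, A/m
    print(f"B = {B:4.2f} T -> H = {H:6.0f} A/m")
```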

The above three losses are pretty much _constant_ for a fixed primary voltage, which means that the transformer draws power even with no load connected at all. In fact, if you increase the load, these core losses will actually go down slightly, because of the voltage drop on the primary side.

The load-dependent I^2R losses will increase quite rapidly as current increases, and these will make the transformer less efficient at higher loads.
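
Putting the fixed core loss together with the square-law copper loss gives the familiar efficiency-versus-load curve; a sketch with placeholder numbers:

```python
# Efficiency vs. load with a fixed core loss plus square-law copper loss.
# All numbers are placeholders for a small transformer, not measured data.
P_rated = 10_000.0   # output rating, VA (assumed)
P_core = 60.0        # constant no-load loss, W (assumed)
P_cu_full = 400.0    # copper loss at full load, W (assumed)

for x in (0.1, 0.25, 0.5, 0.75, 1.0):   # per-unit load
    p_out = x * P_rated                  # assuming unity power factor
    p_loss = P_core + x**2 * P_cu_full   # core ~constant, copper ~x^2
    eff = p_out / (p_out + p_loss)
    print(f"{x:4.0%} load: efficiency = {eff:.3f}")
```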

You can decrease your conduction losses by decreasing the resistance of your coils. You can do this by decreasing the resistivity of the coil materials (copper rather than aluminium). You can pack more wire into the same space (square wire rather than round wire, more difficult winding techniques). You can reduce the temperature of the conductor, lowering its resistance. Or you can increase the space for windings. This latter approach means a physically larger core, more iron, and thus more iron losses.
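
Each of those levers shows up directly in R = rho * L / A plus a temperature correction; a sketch with invented dimensions:

```python
# Winding resistance levers: material, length, cross-section, temperature.
# R = rho * L / A at 20 C, then R(T) = R20 * (1 + alpha * (T - 20)).
# The conductor dimensions are invented for the example.
RHO = {"copper": 1.68e-8, "aluminium": 2.65e-8}    # ohm*m at 20 C
ALPHA = {"copper": 0.00393, "aluminium": 0.00390}  # per degree C

L = 120.0    # total conductor length, m (assumed)
A = 4.0e-6   # conductor cross-section, m^2 (assumed)

for metal in ("copper", "aluminium"):
    R20 = RHO[metal] * L / A
    R90 = R20 * (1 + ALPHA[metal] * (90 - 20))   # hot winding at 90 C
    print(f"{metal:>9}: {R20*1000:5.0f} mohm cold, {R90*1000:5.0f} mohm hot")
```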

You can decrease your saturation losses by increasing the core cross section, thus reducing the flux density. By taking the iron out of saturation you greatly reduce the magnetizing current...but the cost of a larger core is longer conductors, meaning greater conduction losses.
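
The transformer EMF equation makes that trade explicit: for a fixed voltage, frequency, and turn count, peak flux density falls as core area grows. A sketch with an assumed voltage and turn count:

```python
# Transformer EMF equation: V_rms = 4.44 * f * N * A * B_max, so for a
# fixed voltage, frequency, and turns, B_max = V / (4.44 * f * N * A).
V = 240.0   # primary voltage, V rms (assumed)
f = 60.0    # Hz
N = 600     # primary turns (assumed)

for A in (0.0010, 0.0015, 0.0020):   # core cross-section, m^2
    B_max = V / (4.44 * f * N * A)
    print(f"A = {A*1e4:4.0f} cm^2 -> B_max = {B_max:.2f} T")
```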

It just keeps getting traded back and forth, going in circles.

If someone tells you that a copper transformer is more efficient than an aluminium transformer, call bull. _All other things being equal_, meaning same size conductors, same core, same insulation, etc., if you replace the aluminium with copper, then the machine will be more efficient. But all other things won't be equal. The conductors will be made smaller, the core will be made smaller, etc., to balance out the losses; the net result is that the copper transformer _might_ be more efficient.

Also, one of the important design criteria is just what loading should be the point of greatest efficiency. Since to some extent you can trade off core losses against conduction losses, if a transformer is expected to operate at low loading, you will want a machine with low core losses; but for high loading you want low conduction losses. I _believe_ but have not confirmed that current practice is to design dry-type local transformers to have highest efficiency at about 35% loading.
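
For what it's worth, that peak falls where the variable copper loss equals the fixed core loss, so the per-unit load at peak efficiency is sqrt(P_core / P_cu_full); a 35% peak would imply core loss of roughly 12% of full-load copper loss. A quick check under those assumptions:

```python
import math

# Efficiency peaks where variable copper loss equals fixed core loss:
#   x^2 * P_cu_full = P_core  =>  x = sqrt(P_core / P_cu_full).
x_peak = 0.35   # the (unconfirmed) target loading from the text
print(f"peak at {x_peak:.0%} load implies P_core/P_cu_full = {x_peak**2:.3f}")

# Going the other way with assumed losses:
P_core, P_cu_full = 60.0, 400.0   # watts, illustrative values only
print(f"that example unit peaks near {math.sqrt(P_core/P_cu_full):.0%} load")
```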

-Jon