Dnk,

At a given loading, the transformer is putting out X BTU per hour. But that by itself is not enough to determine winding temperature. BTU is a measure of _heat_, and it is heat which causes temperature to change, but the actual temperature of the transformer depends upon the balance of heat gained versus heat lost.

The heat gained is pretty constant for a given load. The heat lost changes all over the place, depending upon such things as air flow, and particularly on temperature: the hotter something is relative to its surroundings, the faster it dumps heat into the surrounding air, and thus the more heat is lost. So when you load the transformer, it heats up until the heat loss naturally balances the heat gain and you reach equilibrium.

If you were to take a fan and force air through the transformer, it would run cooler. The heat gain would be the same, but more heat would be lost at any given temperature. The ambient temperature factors into this as well; the hotter the surroundings, the less heat lost at any given coil temperature, and the higher the coil temperature ends up.
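
If it helps to see the balance written out, here is a rough sketch of that idea as a little Python calculation. Every number in it (watts of loss, the cooling coefficient, the ambients) is made up purely for illustration; none of it comes from your transformer.

# Rough first-order thermal model: at equilibrium, heat in = heat out.
# All numbers here are invented for illustration, not real transformer data.

heat_gain_w = 500.0        # losses at a given load, in watts (roughly constant)
cooling_w_per_degc = 10.0  # watts shed per degree C above ambient (natural convection)
ambient_c = 30.0           # ambient air temperature, deg C

# Equilibrium: heat_gain = cooling * (T_coil - T_ambient),
# so the temperature rise settles at heat_gain / cooling.
rise_c = heat_gain_w / cooling_w_per_degc
print("coil temperature:", ambient_c + rise_c, "C")  # 30 + 50 = 80 C

# Force air through with a fan: the cooling coefficient goes up,
# so the same heat gain balances out at a lower rise.
fan_cooling_w_per_degc = 25.0
print("with fan:", ambient_c + heat_gain_w / fan_cooling_w_per_degc, "C")  # 50 C

# Raise the ambient: the rise above ambient stays the same,
# so the coil simply runs hotter by however much the ambient went up.
print("hot room:", 45.0 + rise_c, "C")  # 95 C

Real heat transfer is not quite that linear, but the balancing act is the same.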

You see the same thing with conductor ampacity and temperature. The more current that flows through the wire, the more heat is generated, and the hotter the wire has to run to dissipate that heat. The higher the ambient temperature, the hotter the wire has to run in order to dissipate the same heat. This is why conductors need to be derated for operation in high temperature environments.
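
As a back-of-the-envelope illustration of the derating (this is just the physics behind it, not the actual code tables): the wire generates I^2*R of heat, and the heat it can shed is roughly proportional to how far it sits above ambient, so the allowable current falls off with the square root of the remaining temperature headroom. The numbers below are assumed for the example.

import math

# Simplified derating sketch: heat generated ~ I^2 * R,
# heat dissipated ~ (T_wire - T_ambient).  Illustrative numbers only;
# real derating factors come from the applicable code tables.
t_insulation_limit_c = 90.0  # maximum conductor temperature
t_ambient_rated_c = 30.0     # ambient the ampacity rating assumes
t_ambient_actual_c = 50.0    # hotter environment
i_rated_a = 100.0            # ampacity at the rated ambient

# Allowable I^2 scales with the available temperature rise,
# so allowable I scales with its square root.
derate = math.sqrt((t_insulation_limit_c - t_ambient_actual_c) /
                   (t_insulation_limit_c - t_ambient_rated_c))
print("derated ampacity:", round(i_rated_a * derate, 1), "A")  # about 81.6 A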

The transformer that you describe is running at 90% of its rated temperature rise, but at only 30% load. This sounds wrong to me, and suggests really high core losses. But I don't really have the experience to back up this hunch; I do design work in a related field (motors), but not much with transformer design. It is plausible that a particular transformer will have similar losses at both low and high load, but at least with motors the losses go up as the load goes up. I wonder if the supply voltage is on the high side, or if there is a harmonic current problem.
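
To put rough numbers on that hunch, using the same first-order model as above (rise roughly proportional to total loss, load loss scaling with the square of the load): a 90% rise at 30% load would imply a core loss several times the full-load copper loss, which would be a very strange transformer. The little calculation below is just that algebra, not anything measured.

# What would the core loss have to be for a 90% rise at 30% load?
# Assumes rise is roughly proportional to total loss and that load loss
# scales with load squared.  Numbers are symbolic, not measured.
copper_loss_full = 1.0   # full-load copper loss, in arbitrary units
load_fraction = 0.30
rise_fraction = 0.90     # measured rise / rated rise

# Rated rise goes with (copper_full + core); the observed rise goes with
# (copper_full * load^2 + core).  Solve
#   copper_full * load^2 + core = rise_fraction * (copper_full + core)
# for core:
core = (rise_fraction - load_fraction ** 2) * copper_loss_full / (1.0 - rise_fraction)
print("implied core loss:", round(core, 1), "x the full-load copper loss")  # about 8.1x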

-Jon