A transformer is rated in volt-amperes because it doesn't actually "consume"
power, except for a small amount of losses due to magnetization.
Good point made here.
Transformers will only "reflect" an Impedance back to the Generating Device, which results in the correct level of True Power being supplied, then transferred to the load device.
They need to be able to carry the load's complete Volt-Amp level, hence the KVA rating.
If the load operates at 1.0 P.F. (a completely True Power load), then the Kilowatts (KW) will equal the Kilovolt-Amperes (KVA) - and the Transformer's rating will reflect this directly.
If the load is not 100% True Power (1.0 P.F.), then the total Apparent Power (Volt-Amps) must be calculated.
The VA figure is the vector sum of True Power (Watts) and Reactive Power (Volt-Amps Reactive, or VARs): VA = sqrt(Watts^2 + VARs^2).
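To make that concrete, here is a minimal sketch of the calculation in Python. The function names and the example load values (80 KW, 60 KVAR) are illustrative, not from this thread:

```python
import math

def apparent_power(true_power_kw, reactive_power_kvar):
    """Apparent power (KVA) as the vector sum of True Power and Reactive Power."""
    return math.hypot(true_power_kw, reactive_power_kvar)

def power_factor(true_power_kw, apparent_power_kva):
    """Power factor = True Power / Apparent Power."""
    return true_power_kw / apparent_power_kva

# Hypothetical example: an 80 KW load drawing 60 KVAR of reactive power
s = apparent_power(80, 60)    # 100 KVA - this is what sizes the transformer
pf = power_factor(80, s)      # 0.8 P.F.
```

Note that the transformer must be sized for the full 100 KVA here, even though the load only "uses" 80 KW of True Power.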
There is a thread somewhere which explains how to calculate Apparent Power / Power Factor. It may be in this forum area (Electrical Tech. Area), or in another area.
If elsewhere, try a subject search, or maybe someone can paste a link in this thread.
Power Generating devices have True Power ratings.
Most Load devices will have True Power ratings - unless they are Reactive devices, which will list either FLA (Full-Load Amperes) or KVA (or both).
Thanks to all for the contributions in this thread!