We have installed a single-phase 50 A 120/240 V service to a small building that is about 2500 ft from the utility power meter. To manage voltage drop over that distance we used two new 15 kVA 600 V to 120/240 V transformers to step the voltage up and then back down, and buried 2500 ft of 3-conductor power cable for the 600 V feed to the load end.
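
As a sanity check on why the 600 V step-up is there at all, here is a rough voltage-drop comparison for the 2500 ft run; the conductor resistance and peak load are only assumptions for illustration, not our actual cable data:

```python
# Rough voltage-drop comparison for the 2500 ft run, fed at 600 V vs. 240 V.
# Conductor resistance and peak load are assumed values for illustration only.

FEET = 2500
R_PER_1000FT = 0.40                       # ohms/1000 ft, roughly #6 AWG copper (assumed)
LOOP_R = 2 * FEET / 1000 * R_PER_1000FT   # out-and-back loop resistance, ohms

def drop(load_watts, volts, pf=1.0):
    amps = load_watts / (volts * pf)      # line current for the given load
    vd = amps * LOOP_R                    # IR drop across the loop
    return amps, vd, 100 * vd / volts

for v in (600, 240):
    a, vd, pct = drop(5000, v)            # assume a 5 kW peak (AC/heating) load
    print(f"{v} V feed: {a:5.1f} A, drop {vd:5.1f} V ({pct:.1f} %)")
```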

The customer's load is normally very small for most of the year, except when they need air conditioning or when it is very cold.

According to the utility power meter they are consuming about 1400 kWh per month, much more than we would expect for the attached loads and the weather we have had.
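
For context, 1400 kWh over a month works out to roughly 1.9 kW of continuous demand, and the core (no-load) losses of the two transformers run 24/7 whether or not the customer draws anything. A back-of-envelope sketch, with the per-unit no-load loss as a placeholder to be replaced by the nameplate or datasheet figure:

```python
# What 1400 kWh/month implies as average demand, and how much two idling
# transformers could plausibly contribute. The no-load loss per unit is an
# assumed placeholder; use the nameplate or datasheet figure instead.

HOURS_PER_MONTH = 730
avg_kw = 1400 / HOURS_PER_MONTH
print(f"Average continuous demand: {avg_kw:.2f} kW")

NO_LOAD_LOSS_W = 150                      # assumed core loss per 15 kVA unit, watts
pair_kwh = 2 * NO_LOAD_LOSS_W / 1000 * HOURS_PER_MONTH
print(f"Two units idling all month: ~{pair_kwh:.0f} kWh "
      f"({100 * pair_kwh / 1400:.0f} % of the metered energy)")
```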

We are not metered or billed for the fairly significant reactive power component that would be evident in this type of arrangement, but I suspect poor power factor could be hurting the overall efficiency of the setup.

My reasoning is that the efficiency of a typical distribution transformer may suffer when the input line has a poor power factor, and at this light loading that would be the case.
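
To make that reasoning concrete: for the same real power delivered, a lower power factor means proportionally more line current, and the I²R (copper) loss in the long cable and the windings grows with the square of that current. A quick sketch, reusing the assumed loop resistance from above and an assumed 2 kW load:

```python
# For the same real power delivered, a lower power factor means more line
# current and more I^2*R loss in the cable and windings. Loop resistance and
# load are the same assumed values as in the earlier sketch.

LOOP_R = 2.0        # ohms, assumed 2500 ft loop (see voltage-drop sketch)
P_LOAD = 2000       # watts of real power at the load (assumed)
V_LINE = 600

for pf in (1.0, 0.8, 0.5):
    amps = P_LOAD / (V_LINE * pf)
    loss = amps ** 2 * LOOP_R
    print(f"PF {pf:.1f}: {amps:.2f} A line current, {loss:5.1f} W lost in the cable")
```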

I am looking for discussion of how a transformer input with a poor power factor, and possibly a distorted waveform, could raise its core and winding losses.

If this is found to be a factor, then we can mitigate it with a small capacitor in the system so that the circuit feeding the load transformer runs at close to unity power factor.
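
For whatever reactive demand we end up measuring, the capacitor size follows from C = Q / (V² · 2πf). A sketch with a placeholder VAR value:

```python
# Rough sizing of a power-factor correction capacitor on the 600 V circuit.
# The reactive power to cancel is a placeholder; substitute the measured VAR.

import math

V = 600             # volts on the long feeder
F = 60              # Hz
Q_VAR = 500         # assumed reactive demand to cancel, in VAR (measure this)

c_farads = Q_VAR / (V ** 2 * 2 * math.pi * F)
print(f"Capacitance to supply {Q_VAR} VAR at {V} V: {c_farads * 1e6:.1f} uF")
```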

We notice that the step-down transformer at the load end is quite hot to the touch. Both transformers are new epoxy-encapsulated outdoor units by ACME or REX, I forget which.

We plan to visit the installation to measure the power factor of the 600 V circuit both loaded and unloaded.
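
Once we have volts, amps, and real watts from that visit, the power factor falls straight out of PF = P / (V · I); a small helper to reduce the readings:

```python
# Reducing the planned field readings to a power factor: PF = P / (V * I).
# The sample numbers are made up; substitute the actual meter readings.

def power_factor(volts, amps, watts):
    apparent_va = volts * amps
    return watts / apparent_va if apparent_va else 0.0

print(f"PF = {power_factor(600, 1.2, 430):.2f}")   # e.g. 600 V, 1.2 A, 430 W
```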

If anyone can advise whether we are on the right track, or has any useful suggestions or measurements to recommend, it would make for an informative discussion.