During discussions of long services, the idea of using higher voltages often comes up. And it isn't a bad idea; this is how long-distance power distribution is done. What follows is my understanding of the _theory_ (no, I've not done one of these installations, and I am sure there are corrections to be made to the below):
You have to consider the cost and losses of the transformers, and additionally you must remember their _impedance_: each transformer has its own 'built-in' voltage drop.
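As a rough sketch of that built-in drop, with assumed numbers: the nameplate impedance (%Z) sets an approximate bound on the transformer's full-load voltage drop. The actual drop also depends on load power factor and the winding X/R split, so treat this as an estimate only.

```python
# Rough sketch (assumed numbers): nameplate %Z as the transformer's
# internal full-load voltage drop, scaled linearly with loading.
Z_PCT = 0.03         # assumed 3% impedance transformer
V_SECONDARY = 208.0  # assumed line-to-line secondary voltage

def impedance_drop(load_fraction):
    """Approximate secondary volts lost inside the transformer."""
    return V_SECONDARY * Z_PCT * load_fraction

print(impedance_drop(1.0))  # ~6.2 V at full load
print(impedance_drop(0.5))  # ~3.1 V at half load
```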
You can use the transformers to compensate for voltage drop by appropriately adjusting the transformer taps; however, this sort of adjustment is good for only one current level. Since the voltage drop _changes_ as the load changes, fixed taps will not help with problems like light flicker when large loads start, or other symptoms of load-dependent voltage drop. In power distribution systems, transformers have automatic tap-changing hardware to regulate the output voltage.
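A small sketch of why a fixed tap only works at one load level, using made-up numbers: a +2% tap chosen to cancel the full-load drop leaves the service 2% high at no load.

```python
# Sketch (hypothetical numbers): a fixed +2% tap boost sized to cancel
# the conductor drop at full load over- or under-corrects elsewhere.
V_NOM = 240.0     # nominal service voltage
R_LOOP = 0.10     # ohms, round-trip conductor resistance (assumed)
TAP_BOOST = 0.02  # fixed +2% tap, chosen to cancel the full-load drop

def delivered_voltage(load_amps):
    """Voltage at the load with a fixed tap boost at the source."""
    return V_NOM * (1 + TAP_BOOST) - load_amps * R_LOOP

# 48 A drops 4.8 V, i.e. 2% of 240 V, so the tap is exact at full load only.
for amps in (0, 24, 48):
    print(f"{amps:3d} A -> {delivered_voltage(amps):.1f} V")
```

At 0 A the load sees about 244.8 V; only at the design current does the tap land back on nominal.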
1) If you can carry the primary voltage closer to the building, and put the transformer there, you will probably be better off in terms of voltage drop. You have the same transformers, and so the same transformer impedance, just arranged for lower resistive losses in the conductors.
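The gain from running the long conductors at primary voltage can be sketched with assumed numbers (a 7200 V primary is just an illustrative figure): for the same wire and the same load power, percent drop scales as 1/V², because the current falls and the same ohmic drop is measured against a larger voltage base.

```python
# Sketch (assumed numbers): the same wire and the same 48 kW load, with
# the run carried at 7200 V primary vs 240 V secondary. Percent drop
# scales as 1/V^2.
P_WATTS = 48_000.0
R_LOOP = 0.20  # ohms, round-trip resistance of the run (assumed)

def pct_drop(volts):
    amps = P_WATTS / volts
    return 100.0 * amps * R_LOOP / volts

print(f"240 V run:  {pct_drop(240):.2f}% drop")
print(f"7200 V run: {pct_drop(7200):.4f}% drop")
```

The ratio is (7200/240)² = 900: the same run that is unusable at utilization voltage is negligible at primary voltage.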
2) If you can get your supply at a higher voltage, and then step down as needed, you _may_ be better off. This is especially true if you have loads that can run directly at the higher voltage. For example, you may be better off with a 480V supply, running loads directly at 480V, and then having smaller 120/208V panels fed from transformers.
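With assumed numbers, the conductor-loss side of that comparison looks like this: the same kVA over the same feeder at 480V draws less than half the current of 208V, and I²R loss falls with the square of the current.

```python
# Sketch (assumed numbers): the same 50 kVA three-phase load fed at
# 208 V vs 480 V over the same feeder resistance per conductor.
import math

KVA = 50.0
R_PER_CONDUCTOR = 0.05  # ohms per conductor, assumed

def line_current(kva, volts_ll):
    """Three-phase line current from kVA and line-to-line volts."""
    return kva * 1000 / (math.sqrt(3) * volts_ll)

def loss_watts(kva, volts_ll):
    """Total I^2*R loss across the three feeder conductors."""
    i = line_current(kva, volts_ll)
    return 3 * i ** 2 * R_PER_CONDUCTOR

for v in (208, 480):
    print(f"{v} V: {line_current(KVA, v):.0f} A, {loss_watts(KVA, v):.0f} W lost")
```

The loss ratio is (480/208)² ≈ 5.3, before counting anything saved on conductor size.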
3) If you take your low voltage service, step it up to a higher intermediate voltage, and then step it down again, you are shooting yourself in the foot. The impedance of the transformers (three of them chained together: the utility's step-down plus your added step-up and step-down) will more than make up for the improved voltage drop in the conductors.
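A back-of-envelope sketch of that trade, with assumed numbers, counting only the two transformers the step-up/step-down scheme adds beyond the direct case: the conductor drop improves as 1/V², but each added transformer costs roughly its %Z at full load.

```python
# Sketch (assumed numbers): stepping a 240 V service up to 480 V for a
# long run, then back down. Conductor drop improves as 1/V^2, but each
# of the two added transformers costs roughly its %Z at full load.
COND_DROP_DIRECT = 0.04  # 4% conductor drop running directly at 240 V (assumed)
XFMR_Z = 0.03            # 3% impedance per added transformer (assumed)

direct = COND_DROP_DIRECT
stepped = COND_DROP_DIRECT / (480 / 240) ** 2 + 2 * XFMR_Z

print(f"direct at 240 V:         {direct:.0%} drop")
print(f"step up, run, step down: {stepped:.0%} drop")
```

With these numbers the conductor drop falls from 4% to 1%, but the two added transformers contribute 6%, for a net loss.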
I know that 'voltage regulating transformers' exist at 120V, but I believe that these are ferroresonant-type transformers that work by keeping the core saturated; since the saturation level changes little with input voltage, the output is stabilized against input variation.
Do voltage-regulating, tap-changing autotransformers exist for this sort of situation? It would seem to me that a very small autotransformer could compensate for voltage drop on a very large service, small meaning a transformer of perhaps 5% of the service kVA.
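The 5% sizing guess above can be sketched as follows (the 200 kVA service size is an assumed example): in a buck/boost autotransformer connection, the series winding only handles the correction fraction of the throughput, so the required frame size is roughly the service kVA times the correction range.

```python
# Sketch (assumed numbers): why a compensating autotransformer can be
# small. The series winding carries only the boost fraction of the
# power, so transformed kVA = throughput kVA * correction range.
SERVICE_KVA = 200.0  # assumed service size
BOOST_RANGE = 0.05   # +/-5% correction range, per the guess above

auto_kva = SERVICE_KVA * BOOST_RANGE
print(auto_kva)  # a 10 kVA unit for a 200 kVA service
```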