Not sure if this fits in the theory category, but it seems like it does.

Nominal voltage for a typical residence is 120/240 V, with a tolerance of roughly plus or minus 5%. It seems to me that most services actually run around 122 or 123 V.

I used to be under the impression that the higher the voltage, the lower the amperage. That turns out to be misleading: it holds in certain instances (a load that draws constant power), but not for a plain resistive load whose resistance stays fixed.

Ohm's law discredits this theory to a point.

Example: a 120 V source across a 12 ohm load draws 10 A.

The same load at the higher voltage:
123 V across 12 ohms draws 10.25 A.
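
Just to sanity-check those numbers, here's a minimal Python sketch of Ohm's law (I = V / R), assuming a purely resistive 12-ohm load (the resistive-load part is my assumption):

[code]
# Minimal check of the currents above with Ohm's law, I = V / R,
# assuming a purely resistive 12-ohm load.
RESISTANCE = 12.0  # ohms

for volts in (120.0, 123.0):
    amps = volts / RESISTANCE
    print(f"{volts:.0f} V across {RESISTANCE:.0f} ohms -> {amps:.2f} A")

# Output:
# 120 V across 12 ohms -> 10.00 A
# 123 V across 12 ohms -> 10.25 A
[/code]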

Now I know that's only an extra quarter of an amp on the circuit, but how does that show up at the meter day in and day out? I've sketched some rough numbers below.
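
For illustration only, here's a rough sketch of what the meter might record if that same hypothetical 12-ohm resistive load ran around the clock. The 24-hour, continuous-duty run time is my assumption, just to put numbers on it:

[code]
# Rough sketch of what the meter would record for the hypothetical
# 12-ohm resistive load above: power P = V^2 / R and energy per day.
# The 24-hours-a-day run time is an assumption for illustration.
RESISTANCE = 12.0     # ohms
HOURS_PER_DAY = 24.0  # assumed continuous duty

for volts in (120.0, 123.0):
    watts = volts ** 2 / RESISTANCE
    kwh_per_day = watts * HOURS_PER_DAY / 1000.0
    print(f"{volts:.0f} V: {watts:.1f} W, {kwh_per_day:.2f} kWh per day")

# Output:
# 120 V: 1200.0 W, 28.80 kWh per day
# 123 V: 1260.8 W, 30.26 kWh per day   (about 5% more energy billed)
[/code]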

Is this a means of getting deeper into our pockets?

Your thoughts on this please.