ECN Forum
Not sure if this fits in the theory category, but it seems to.

Nominal voltage for a typical residence is 120/240 V, with an allowed tolerance of about ±5%. It seems to me that most services run around 122 or 123 V.

I used to be under the impression that the higher the voltage, the lower the amperage. But that's misleading: it's true only in certain instances.

Ohm's law discredits this theory, up to a point.

Example: a 120 V source across a 12 Ω load draws 10 A.

The same load at a slightly higher voltage:
123 V, 12 Ω, 10.25 A
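
Working those same numbers through for power (a quick check, using only the figures above): for a fixed resistance, power goes as the square of the voltage.

    I = V / R = 123 V / 12 Ω = 10.25 A
    P = V² / R = 123² / 12 ≈ 1261 W, versus 120² / 12 = 1200 W

So a 2.5% voltage increase means about 5% more power drawn by the same resistance.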

Now I know that's only a quarter-amp increase on the circuit, but how does that reflect itself at the meter, day in and day out?

Is this a means of getting deeper into our pockets?

Your thoughts on this please.
Posted By: Anonymous Re: Could it be that the power companys are........ - 08/03/01 07:34 AM
Voltage around here seems to run 125.7. We are being robbed! No, seriously...

Scott has said before that meters measure true power. If the volts are lower, the same amps deliver less power. It's a kilowatt-hour meter, not a kiloamp-hour meter.
Perhaps a schematic for a simple meter would be in order.
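
Not a schematic, but here is a minimal numerical sketch (in Python) of what a watt-hour meter effectively accumulates: instantaneous volts times amps, integrated over time. The 60 Hz sine wave and the 12-ohm resistive load are illustrative assumptions, not a model of any particular meter.

    import math

    f = 60.0                   # line frequency, Hz
    v_rms = 123.0              # service voltage, V rms
    r_load = 12.0              # purely resistive load, ohms
    dt = 1.0 / (f * 200.0)     # 200 samples per cycle
    t = 0.0
    energy_j = 0.0
    while t < 1.0:             # integrate one second (60 full cycles)
        v = v_rms * math.sqrt(2.0) * math.sin(2.0 * math.pi * f * t)
        i = v / r_load         # current follows voltage in a resistive load
        energy_j += v * i * dt # accumulate joules: watts x seconds
        t += dt
    avg_watts = energy_j       # one second's worth of joules = average watts
    print(f"{avg_watts:.0f} W average, {avg_watts / 1000.0:.3f} kWh per hour")

Rerun it with v_rms = 120.0 and the average drops from about 1261 W to 1200 W, which is the 5% difference showing up at the meter.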

At a higher voltage, a water heater will heat the water faster and then shut off.
Motors may run faster, finish sooner, and shut off.

However, light bulbs burn brighter and hotter and burn out sooner at the higher voltage, and of course consume more energy.
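
To put rough numbers on that contrast, here is a hedged sketch assuming a thermostat-controlled heater whose job takes a fixed 4 kWh and a 75 W bulb left on for 5 hours; all figures are illustrative, not measurements.

    def heater_energy_kwh(v, r=12.0, job_kwh=4.0):
        """Thermostatted heater: the job takes fixed energy, so higher
        voltage just finishes it sooner; energy billed stays the same."""
        power_kw = v * v / r / 1000.0
        runtime_h = job_kwh / power_kw   # element shuts off when done
        return power_kw * runtime_h      # = job_kwh regardless of voltage

    def bulb_energy_kwh(v, rated_w=75.0, rated_v=120.0, hours=5.0):
        """Incandescent bulb: runs a fixed time at near-constant resistance,
        so energy tracks the square of the voltage."""
        r = rated_v ** 2 / rated_w
        return v * v / r * hours / 1000.0

    for v in (120.0, 123.0):
        print(v, heater_energy_kwh(v), round(bulb_energy_kwh(v), 3))

The heater comes out at 4.0 kWh either way; the bulb goes from 0.375 kWh to about 0.394 kWh, roughly 5% more.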

In California there has been talk of lowering voltage to reduce the power consumption, so there must be a significant correlation.


I live in government-owned projects. The voltage is about 116 volts. Not only is the government getting a cheaper bill, but my 75-watt bulbs are running at about 70 watts. Go figure.
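
That 70-watt figure checks out if the bulb filament is treated as a roughly constant resistance: a 75 W bulb rated at 120 V has R = 120² / 75 = 192 Ω, and at 116 V it dissipates 116² / 192 ≈ 70 W.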
This seems to be an unusually frequent subject in online electrical message boards.

The ANSI-standard [C84.1-1995 on your scorecards] acceptable voltage range for a 120-volt base is on the order of 110-126 volts, corresponding to about +5%/-8%.

C84.1 is sort of a treaty hammered out by utilities and appliance/equipment manufacturers, and gets quietly referenced in various ‘official’ documents by both groups.

Some electrical professionals miss an important aspect of how power demand responds to voltage variations from nameplate ratings. Resistive loads [incandescent lamps & heaters] are effectively constant impedance, so an intentional or incidental increase in voltage produces higher real-power [watt] dissipation and correspondingly higher energy use. OTOH, within limits, inductive loads [motors/solenoids] act more like constant apparent-power [volt-ampere] devices, so that, to a degree, operation at higher voltage produces lower load current.

The voltage/power/energy relationship has been studied in precise detail by utilities and equipment producers. Although it is generally figured {off-the-cuff} that lower voltage translates to reduced energy use, that statement is by no means chiseled in stone. In fact, one utility study of voltage reduction in a Scandinavian country showed that, for resistance heat over a day's time, energy use remained essentially flat with changes in voltage: the increased heater run time in building heating systems 'compensated' for the decrease in instantaneous power demand. {However interesting to me, I do not work for a utility.} Utilities unofficially acknowledge that voltage-control studies are inconclusive.
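
If it helps to see those two idealized load models side by side, here is a rough sketch; real loads fall somewhere between the extremes, and the 12 Ω resistance and 1200 VA rating are made-up illustrative values, not data from any study.

    def resistive_load(v, r=12.0):
        """Constant impedance: current and power both rise with voltage."""
        i = v / r
        return i, v * i              # amps, watts

    def motor_like_load(v, va=1200.0):
        """Idealized constant apparent power: current falls as voltage rises."""
        i = va / v
        return i, va                 # amps, volt-amperes

    for v in (110.0, 120.0, 126.0):
        ri, rp = resistive_load(v)
        mi, mp = motor_like_load(v)
        print(f"{v:5.1f} V   resistive: {ri:5.2f} A {rp:6.0f} W   "
              f"motor-ish: {mi:5.2f} A {mp:6.0f} VA")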
© ECN Electrical Forums