Erik,

Yes, current is directly proportional to voltage, so long as the resistance remains constant.

Where you're going wrong is in comparing situations with vastly different values of resistance.

From Ohm's Law we have I = V / R.

We also have W = I x V.

Each of those formulae relates three of the four quantities involved, but you have to remember that all four are inter-related. That's how we can come up with other derived formulae, such as W = (I^2) x R.
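If it helps to see how the formulae tie together, here's a quick Python sketch. The 12V / 6 ohm values are just illustrative numbers I've picked, not from any particular circuit:

```python
# Ohm's law and the power law, plus one derived form.
# V and R are arbitrary illustrative values.
V = 12.0  # volts
R = 6.0   # ohms

I = V / R          # Ohm's law: I = V / R
W = I * V          # power law: W = I x V
W_alt = I**2 * R   # derived form: W = I^2 x R

print(I, W, W_alt)  # both power formulae give the same answer
```

Either power formula gives the same result, because they're both built from the same four inter-related quantities.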

Let's take an example:

A 60W light bulb on a supply of 120V will draw a current of:

I = W / V = 60 / 120 = 0.5A.

From that you can work out the resistance of the bulb's filament using Ohm's Law:

R = V / I = 120 / 0.5 = 240 ohms.

Now, if you were to increase the supply voltage a little, let's say to 140V, while that resistance remains constant both the current and the power would increase slightly:

I = V / R = 140 / 240 = 0.583A.

W = I x V = 0.583 x 140 = 81.7 watts.
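The same arithmetic in a short Python sketch, using the numbers from the example above (and the same simplifying assumption that the filament resistance stays fixed):

```python
# The 60W bulb on a 120V supply.
W, V = 60.0, 120.0
I = W / V    # current: 0.5 A
R = V / I    # filament resistance via Ohm's law: 240 ohms

# Raise the supply to 140V with the resistance held constant:
V2 = 140.0
I2 = V2 / R   # current rises to about 0.583 A
W2 = I2 * V2  # power rises to about 81.7 W

print(I, R, I2, W2)
```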

(For pedants: Yes the resistance of the filament would change a little, but I'm trying to keep it simple!)

Similarly, if you lowered the voltage while the resistance stayed the same, both the current and the power would be correspondingly reduced.

Now let's look at a 60W light designed to run on a 240V supply, like we have here in England. Current will be:

I = W / V = 60 / 240 = 0.25A.

So here you can see that for the same amount of power (60W in this case), if you double the voltage you only need half as much current.

However, now work out the resistance of the filament:

R = V / I = 240 / 0.25 = 960 ohms.

That's a resistance four times greater than for the 60W 120V bulb.

So although we've doubled the voltage, the current is only half as much because the resistance is four times higher. The power has remained the same because 240V x 0.25A and 120V x 0.5A both come to 60W.
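Here's the side-by-side comparison in Python, using the figures from the two bulbs above:

```python
# Two 60W bulbs: one for a 120V supply, one for 240V.
W = 60.0
I_120 = W / 120.0        # 0.5 A
I_240 = W / 240.0        # 0.25 A - half the current
R_120 = 120.0 / I_120    # 240 ohms
R_240 = 240.0 / I_240    # 960 ohms - four times the resistance

# Both bulbs dissipate the same 60W:
print(I_120 * 120.0, I_240 * 240.0)
```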

*****

Your query about a bigger load reducing the voltage has a slightly different slant.

I think what you're talking about is the case where the measured voltage at an outlet drops when you connect or increase the load, yes?

What you're seeing here is not so much a reduction of the EMF, or source of the voltage, but rather the voltage drop in the wires to the outlet.

All wires have a certain amount of resistance, and obey Ohm's Law like any other component.

Let's assume that we start with exactly 120V and that the wires connecting to some outlet have a combined resistance of 0.5 ohm. If you draw a load of, let's say, 1A from that outlet, the voltage dropped along the cable will be:

V = I x R = 1 x 0.5 = 0.5V

So the voltage measured at the outlet will drop to 119.5 volts.

Now if you increase the load to, say, 10A, the voltage drop along the cable becomes much higher:

V = I x R = 10 x 0.5 = 5V.

Now you're losing 5V in the cables so the voltage you measure at the outlet will be down to 115V. It's still 120V at the source, but you're losing 5 volts in the wiring.

The voltage drop on the wires is also why heavily loaded cables get warm. In this instance, the power dissipated by the cable would be:

W = I x V = 10 x 5 = 50W.

The trick with all of these calculations is to remember that Ohm's Law (and all the derived formulae) work with both the circuit as a whole and with each individual part of the circuit. You just have to make sure that you use the right values for the part of the circuit in question, e.g. in the above power calculation, only 5V appears across the cables, so that's the voltage to use to work out the power lost in the cables.

The relationship between W, V, I and R is why you can use the other formulae to get the same result, e.g.

W = (I^2) x R = 10^2 x 0.5 = 50W.
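Putting the whole voltage-drop example into Python makes the point about applying the formulae to just one part of the circuit. (The `outlet_readings` helper is just my own name for illustration; the 120V source and 0.5 ohm wiring are the figures from the example above.)

```python
# Voltage drop and heating in 0.5-ohm wiring fed from a 120V source.
V_source = 120.0
R_wire = 0.5

def outlet_readings(I):
    """Return (volts dropped in the wires, volts at the outlet, watts heating the cable)."""
    V_drop = I * R_wire            # Ohm's law applied to the wires alone
    W_wire = I * V_drop            # power lost in the wires, W = I x V
    return V_drop, V_source - V_drop, W_wire

light = outlet_readings(1.0)   # light load: small drop
heavy = outlet_readings(10.0)  # heavy load: much bigger drop
print(light, heavy)

# The derived formula gives the same cable loss at 10A:
print(10.0**2 * R_wire)  # W = I^2 x R
```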

Clear as mud?

******

Goof-alert:

Formula at top edited so that message reads properly.

Sorry about that!

[This message has been edited by pauluk (edited 06-15-2002).]