The current in a circuit is directly proportional to the voltage, but only for a given value of resistance.

You're dealing with what are effectively two separate circuits (primary and secondary), each of which can have vastly different values of resistance.

Look at it this way:
If you keep the resistance of a circuit constant, then increasing the voltage will result in a corresponding increase in current (just as you would expect from Ohm's Law). You have increased both voltage and current, therefore you have also caused an increase in the dissipated power (P = I x E).
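A quick Python sketch of that point, using arbitrary example values (the 24-ohm resistance and the two voltages are just illustrations, not from any particular circuit):

```python
# With resistance held constant, raising the voltage raises both the
# current (Ohm's Law: I = E / R) and the dissipated power (P = I x E).
R = 24.0  # ohms, held constant

for E in (120.0, 240.0):
    I = E / R   # current rises in proportion to voltage
    P = I * E   # power rises with both, so it goes up even faster
    print(f"E = {E:g} V -> I = {I:g} A, P = {P:g} W")
```

Doubling the voltage here doubles the current but quadruples the power, since both factors in P = I x E went up.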

Now let's look at the transformer. Maybe an example with numbers will help. Assume you have a transformer with a 240V secondary feeding a 2400W load. The primary side is fed with 2400V.


The current in the secondary will be:

2400W / 240V = 10A

And from Ohm's Law the resistance must be:

240V / 10A = 24 ohms.
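Those secondary-side numbers, as a small Python check (variable names are just my own labels for the figures above):

```python
# Secondary side of the example: a 2400 W load on a 240 V winding.
P_load = 2400.0  # watts drawn by the load
V_sec = 240.0    # volts across the secondary

I_sec = P_load / V_sec  # current: P / E = 10 A
R_sec = V_sec / I_sec   # effective load resistance: E / I = 24 ohms
print(f"I_sec = {I_sec:g} A, R_sec = {R_sec:g} ohms")
```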

Now let's see what's happening on the primary side. You can't get more power out of a transformer than you put into it. To get 2400W from the secondary, you therefore have to feed 2400W into the primary (*).

If the primary side is running 2400W at 2400V, then the current must be:

2400W / 2400V = 1A.

Then from Ohm's Law, the primary resistance is therefore:

2400V / 1A = 2400 ohms.
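And the matching primary-side check, assuming the ideal (lossless) transformer described above:

```python
# Primary side: an ideal transformer passes the full 2400 W through,
# so the input power equals the output power.
P_in = 2400.0   # watts fed into the primary
V_pri = 2400.0  # volts on the primary winding

I_pri = P_in / V_pri    # current: P / E = 1 A
R_pri = V_pri / I_pri   # resistance "seen" by the source: E / I = 2400 ohms
print(f"I_pri = {I_pri:g} A, R_pri = {R_pri:g} ohms")
```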

In other words, the resistance seen at the primary is a hundred times greater than at the secondary (2400 ohms vs. 24 ohms), which is the square of the 10:1 voltage ratio. That's why even though the primary voltage is ten times greater than the secondary, the primary current is still only one-tenth of the secondary current.

(*) In practice, of course, no transformer is 100% efficient, so the output power will actually be slightly less than the input power due to losses. I've also completely ignored power-factor considerations here to keep things simple.