I built a hot wire cutter. My nichrome wire has 1.5 ohms of resistance. I feed the hot wire from a 12 volt transformer whose primary is driven by a 120 V variac, so turning the variac up or down controls the voltage coming out of the transformer.
As I turn the voltage up, I notice that my hot wire cutter starts cutting foam at around 46 volts measured at the variac, which is about 4.6 volts measured coming out of the transformer.
So here are my questions.
If my wire has 1.5 ohms of resistance, what does that tell me?
Let's call the voltage from the 12v transformer 4.5 volts for simplicity's sake. Well, with 1.5 ohms and 4.5 volts, that means I've got a 3 amp load, or 13.5 watts, right?
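In case it helps to see my arithmetic, here's a quick Python sketch of that calculation (plain Ohm's law on the secondary side, using my measured numbers):

```python
# Ohm's law on the transformer's secondary (hot wire) side.
V = 4.5   # volts across the nichrome wire (measured at transformer output)
R = 1.5   # ohms, resistance of the nichrome wire

I = V / R   # current in amps
P = V * I   # power in watts (equivalently I**2 * R)

print(f"current = {I:.1f} A, power = {P:.1f} W")
# current = 3.0 A, power = 13.5 W
```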
Or does it mean that I've got 45 volts coming out of the variac, and with a 1.5 ohm resistance in the nichrome wire, that means I've got a 30 amp load, or 135 watts? That can't be right, my variac has a 10 amp fuse.
When I started, I thought that with 1.5 ohms of resistance at 12 volts, that would mean 8 amps and 96 watts. But the nichrome wire will blow apart at that amperage. That's why I have to control the voltage with a variac. So as I reduce the voltage, the amps are reduced, but the resistance stays the same...right?
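Here's the same arithmetic swept over a few secondary voltages, assuming the 1.5 ohm resistance stays fixed (which is the part I'm asking about):

```python
R = 1.5  # ohms, nichrome wire resistance (assumed constant here)

# Sweep the secondary voltage down from the full 12 V.
for V in (12.0, 9.0, 6.0, 4.5):
    I = V / R   # amps
    P = V * I   # watts
    print(f"{V:>4.1f} V -> {I:.1f} A, {P:.1f} W")
# 12.0 V -> 8.0 A, 96.0 W
#  9.0 V -> 6.0 A, 54.0 W
#  6.0 V -> 4.0 A, 24.0 W
#  4.5 V -> 3.0 A, 13.5 W
```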
I have a little problem applying Ohm's law from one side of the transformer to the other. Can somebody dial me in?
"When in doubt, short it out"