One more time... (the basic power formula, P = V × I):
300W / 12V = 25A
300W / 120V = 2.5A
Volts down = Amps up
Voltage up = Amps down
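The watts/volts arithmetic above can be sketched in a couple of lines of Python (names are mine, just for illustration):

```python
# Current drawn by a 300W load at each supply voltage (I = P / V)
power_w = 300.0

for volts in (12.0, 120.0):
    amps = power_w / volts
    print(f"{power_w:g}W / {volts:g}V = {amps:g}A")
# -> 300W / 12V = 25A
# -> 300W / 120V = 2.5A
```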
When the resulting amperage is applied to the voltage-drop calculation:
300W / 12V = 25A
2 × 100' × 3.07 Ω/kFT (Table 8) × 25A / 1000 = 15.35V; 15.35 / 12 = 1.279 = 128% VD
(It's over 100%, so at some point under a 300W load the voltage at the fixture would collapse toward zero, and the result would be very high heat and amperage: 300W / 0.001V = 300,000A.)
The same with 120 volts:
300W / 120V = 2.5A
2 × 100' × 3.07 Ω/kFT (Table 8) × 2.5A / 1000 = 1.535V; 1.535 / 120 = 0.0128 ≈ 1.3% VD. A big difference from 128%.
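Both voltage-drop calculations can be checked with a short script (a sketch only; 3.07 is the Table 8 resistance in ohms per 1000 feet, and the function name is my own):

```python
def voltage_drop(one_way_ft, ohms_per_kft, amps):
    """Two-wire voltage drop: 2 x one-way length x (R per kFT) x I / 1000."""
    return 2 * one_way_ft * ohms_per_kft * amps / 1000

for supply_v in (12.0, 120.0):
    amps = 300.0 / supply_v                 # I = P / V for a 300W load
    vd = voltage_drop(100, 3.07, amps)      # 100 ft run, Table 8 value
    print(f"{supply_v:g}V: drop = {vd:.3f}V = {vd / supply_v:.1%} VD")
```

This prints a 15.350V drop (about 128%) on the 12V circuit versus a 1.535V drop (about 1.3%) on the 120V circuit.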
Now for dimmers: most modern ones are "clipping" (phase-control) type, as opposed to resistance type. The peak voltage stays relatively the same, since the dimmer only clips off part of the sine wave to create the dimmed effect. The part of the sine wave it clips off is the area where voltage is lowest, so the fixture never really sees that low-voltage / high-amperage part of the sine wave. That's why dimmed lamps last longer, and your dining room light doesn't self-destruct. They are similar to the sine wave shown in this:
http://www.lutron.com/product_technical/pdf/LutronDimmingBasics.pdf But if you used an older resistance-type dimmer, or an incandescent dimmer on magnetic low voltage, you might have some problems.
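To see why clipping lowers the delivered power without lowering the peak, here's a rough numeric sketch of a forward phase-cut waveform (my own simulation, not from the Lutron PDF; the 90° firing angle and sample count are arbitrary choices):

```python
import math

PEAK = 120.0 * math.sqrt(2)   # ~169.7V peak for a 120V RMS supply
ALPHA = math.pi / 2           # firing angle: blank the first 90 deg of each half-cycle
N = 100_000                   # samples over one full cycle

full, cut = [], []
for i in range(N):
    theta = 2 * math.pi * i / N
    v = PEAK * math.sin(theta)
    full.append(v)
    # Forward phase control: output stays at zero from each zero-crossing
    # until the firing angle, then follows the sine for the rest of the half-cycle.
    cut.append(0.0 if (theta % math.pi) < ALPHA else v)

def rms(xs):
    return math.sqrt(sum(v * v for v in xs) / len(xs))

print(f"peak: full = {max(full):.1f}V, clipped = {max(cut):.1f}V")
print(f"RMS:  full = {rms(full):.1f}V, clipped = {rms(cut):.1f}V")
```

With half of each half-cycle clipped, the peak stays near 169.7V while the RMS falls from 120V to roughly 85V, which matches the idea that the lamp sees less energy but the waveform's top is untouched.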
As for NEC 210.19 FPN No. 4: although not mandatory, it is a clear warning about the issue. Since the NEC deals mostly with voltages of 120 and above, it isn't so much an issue there. But at 12 volts the voltage drop is ten times higher, so I could see it being added to Article 411 in the future.