I suppose it made a certain amount of sense (at least early on) as far as fuse ratings are concerned (reducing thermal stress on the fusible element by staying comfortably below its limit most of the time, with an occasional excursion to full capacity allowed for momentary or infrequently-used loads).
But is it the optimal strategy for overall resource use? I'm not so sure. In Europe (as has been mentioned before), fuses (and naturally circuit breakers too) are specified to carry their rated amperes continuously (although with earlier fuse types this did require somewhat more generous cable sizing); and since the bulk of Australian electrical practice (apart from the plug) originates from Europe (including Britain), albeit with local amendments here and there, the same is naturally true here.
(So with our traditional 240V, the normal 10A outlet indeed supports heaters up to and including 2400W, including the one running on full power as I type; and the British, with the same 240V but 13A outlets, can draw up to 3120W from each.)
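To put quick numbers on those figures, here is a trivial back-of-the-envelope sketch. Note the American 120V/15A example at the end is my own illustrative addition (it isn't from the discussion above), included just to contrast the 80%-of-rating continuous-load convention with the European full-rating approach:

```python
def outlet_watts(volts, amps):
    """Maximum power from an outlet whose protection carries full rated amps."""
    return volts * amps

# Australian 10A outlet at the traditional 240V:
au = outlet_watts(240, 10)          # 2400 W

# British 13A outlet at the same 240V:
uk = outlet_watts(240, 13)          # 3120 W

# Illustrative contrast (my assumption, not from the post): an American
# 120V/15A circuit under the 80% continuous-load convention caps
# continuous draw at:
us_continuous = outlet_watts(120, 15) * 0.8   # 1440 W

print(au, uk, us_continuous)
```

The gap between the last figure and the first two is essentially the whole argument: the same nominal ampacity buys noticeably less usable continuous power under the derating convention.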
Certainly on circuits dedicated to a single high-power load (or two or three), such as air conditioners, hot water tanks, tankless water heaters, your goliath, and (probably silliest of all) those oversized central fan heaters a.k.a. "electric furnaces", I fail to see any practical benefit whatsoever in keeping this old rule.
(The cables themselves are the least-stressed part of the system, and also contain the most copper when run over any decent length; so surely the European way of building the supporting hardware for continuous full load is the more resourceful approach here.)
Granted, the rule itself will persist in America for a long while yet (at least for portable appliances, for as long as old equipment remains in use); but for new installations, I think continuous-rated circuit protection is clearly the way to go (where available).