I have touched on this subject before, but this time I'll start a separate thread.

In many countries there is a requirement that a circuit is matched with its protective device so that a short between live and earth is cleared within 0.4 seconds (5 seconds in some cases, but that distinction doesn't matter with MCBs).

If there isn't an RCD, this is achieved by limiting (by design) the earth fault loop impedance so that the breaker trips quickly enough in case of a fault.

Here comes my concern: as far as I know, this is calculated up to the outlet, which is fine for a cooker or anything else that stays in place. But for a general-purpose socket, the connected cord and any extension cord can add significantly to the impedance.

(This does not apply to BS 1363 sockets, since those are fused, but it does apply to industrial-type sockets.)

If I plug in 10 metres of 1.0 mm2 cord, which isn't unrealistic, I add about 0.34 ohm to the loop impedance. How does that relate to the total?
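For reference, the 0.34 ohm figure follows from the resistivity of copper (roughly 0.0172 ohm·mm²/m at 20 °C), counting both the line and the protective conductor of the cord. A quick sketch:

```python
# Loop impedance added by a flexible cord: current flows out on the
# line conductor and back on the protective conductor, so we count
# twice the one-way resistance.
RHO_CU = 0.0172  # ohm * mm^2 / m, copper at ~20 C


def cord_loop_impedance(length_m, csa_mm2):
    """Earth fault loop impedance contributed by a cord (resistance only)."""
    return 2 * RHO_CU * length_m / csa_mm2


print(cord_loop_impedance(10, 1.0))  # ~0.34 ohm for 10 m of 1.0 mm2
```

Reactance is ignored here, which is a fair approximation for small-section flexible cords.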

IIRC, the loop impedance for a TN-C service can be taken as 0.3 ohm, which should correspond roughly to a 40 A supply. We allow a maximum voltage drop of 4% in the branch circuit. If the branch circuit is a 20 A radial with equally sized conductors (T&E cable is worse, since its protective conductor is smaller), this corresponds to an impedance of 0.46 ohm. That leaves us with a total impedance to the socket of 0.76 ohm.
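Those numbers check out. A sketch, assuming 230 V nominal, the full 20 A flowing at the 4% drop limit, and equally sized line and protective conductors (so the fault loop sees the same impedance as the load loop):

```python
U = 230.0        # nominal voltage, V
ZE = 0.3         # assumed TN service (external) loop impedance, ohm
I_DESIGN = 20.0  # radial circuit rating, A
MAX_DROP = 0.04  # 4% allowed voltage drop

# Branch impedance implied by the voltage-drop limit at rated current.
z_branch = MAX_DROP * U / I_DESIGN

# Total earth fault loop impedance at the socket.
z_socket = ZE + z_branch
print(z_branch, z_socket)  # about 0.46 and 0.76 ohm
```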

The resulting fault current is then 230 V / 0.76 ohm > 300 A. This will trip a type C breaker.

But if we add the cord, the impedance rises to 1.1 ohm and the current falls to just over 200 A. Still sufficient, but only just: the magnetic trip of a type C breaker is specified at 5 to 10 times rated current, so a C20 is only guaranteed to trip instantaneously above 200 A.
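Putting the numbers together, and taking the worst-case instantaneous trip point of a type C MCB as 10 times rated current (the C curve spans 5 to 10 times In):

```python
U = 230.0
Z_SOCKET = 0.76   # service + branch loop impedance, ohm
Z_CORD = 0.34     # 10 m of 1.0 mm2 flex, ohm
TRIP = 10 * 20.0  # worst-case instantaneous trip of a C20: 200 A

i_socket = U / Z_SOCKET            # fault at the socket itself
i_cord = U / (Z_SOCKET + Z_CORD)   # fault at the far end of the cord

print(round(i_socket), round(i_cord))  # ~303 A and ~209 A
assert i_socket > TRIP and i_cord > TRIP  # both trip, the cord case barely
```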

Had we used a longer or a thinner cord, we would have been below the limit.
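Under the same assumptions, one can solve for the longest 1.0 mm2 cord that still leaves the guaranteed 200 A available. The result lands only slightly above the 10 m example, which illustrates how little margin there is:

```python
U = 230.0
Z_SOCKET = 0.76   # loop impedance at the socket, ohm
RHO_CU = 0.0172   # ohm * mm^2 / m, copper at ~20 C
TRIP = 200.0      # worst-case C20 magnetic trip, A

# Impedance budget left over for the cord, then the length of
# 1.0 mm2 flex (out and back) that uses it up.
z_cord_max = U / TRIP - Z_SOCKET
length_max = z_cord_max * 1.0 / (2 * RHO_CU)
print(round(length_max, 1))  # roughly 11 m
```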

The same reasoning applies to line-to-neutral short circuits, where the danger is 'just' fire.