I remember reading some comments on the IEE forum about very low Ze values some months ago, and have just come across such a situation myself.

Clipped the loop tester onto the main terminals (installation & bonding isolated, of course), selected the 20-ohm range, hit the button, and presto: "0.00" on the display. Repeated tests gave the same result every time.

Now, the specified accuracy for the tester (Robin 4116) is ±2% of reading plus 2 digits, and with a resolution of 0.01 ohm on that range the 2-digit term alone amounts to 0.02 ohm. So all I can really say for sure is that the Ze on this service is probably less than 0.02 ohm.
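
For anyone who wants to see the arithmetic, here's a minimal sketch in Python, assuming the spec follows the usual convention of ±(2% of reading + 2 counts of the least significant digit):

```python
# Worst-case band implied by a "2% + 2 digits" accuracy spec, assuming
# the usual convention of +/-(2% of reading + 2 counts of the LSD).
RESOLUTION = 0.01  # ohms per digit on the 20-ohm range

def uncertainty_band(reading, pct=0.02, digits=2, resolution=RESOLUTION):
    """Return (low, high) worst-case true values for a displayed reading."""
    err = pct * reading + digits * resolution
    return max(0.0, reading - err), reading + err

low, high = uncertainty_band(0.00)
print(f"displayed 0.00 ohm -> true Ze anywhere from {low:.2f} to {high:.2f} ohm")
```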

Given this level of accuracy on the typical tester, it makes me wonder what use Ze measurements are once we get down to such low values. The Ze I measured could be 0.02 ohm, giving a prospective fault current of 12kA (taking 240V nominal). Or it could be 0.015 ohm with a PFC of 16kA, the maximum that the DNOs will usually specify by default on a single-phase supply to cover themselves, even though for many services the true value will be considerably lower. It might even be worse than that for all I know.
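
To put numbers on how much that swing matters, a quick sketch (the 240V nominal is an assumption on my part, but it's the voltage those kA figures imply):

```python
# PFC swing across the plausible Ze range at the bottom of the tester's
# band. The 240 V nominal supply voltage is assumed (it is what the
# 12 kA and 16 kA figures above imply).
U_NOMINAL = 240.0  # volts, assumed

for ze in (0.020, 0.015, 0.010):
    pfc_ka = U_NOMINAL / ze / 1000.0
    print(f"Ze = {ze:.3f} ohm -> PFC = {pfc_ka:.1f} kA")
```

A difference of 0.005 ohm, well inside the tester's error band, moves the PFC by several kA.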

Even with a measured Ze of 0.04 or 0.05 ohm, the accuracy of a typical loop tester leaves enough scope for error that one has to question the value of such measurements at these extremes.

Personally, I tend to think that the widespread adoption of digital readouts leads people to infer a degree of accuracy which is not inherent in the equipment. People see, for example, "0.05" on a display and don't stop to think that although the resolution goes down to 0.01, the accuracy is poor at the limits of the instrument's measuring capability: in this case the 2-digit term alone could mean a true value anywhere between 0.03 and 0.07 ohm, a very significant difference.
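
If you tabulate that band as a fraction of the displayed reading (same assumed ±(2% + 2 digits) interpretation as above), the point makes itself:

```python
# Resolution vs accuracy: the +/-(2% + 2 digits) band blows up, in
# relative terms, at the bottom of the range. 0.01 ohm/digit assumed.
RESOLUTION = 0.01  # ohms per digit

def band(reading, pct=0.02, digits=2):
    err = pct * reading + digits * RESOLUTION
    return max(0.0, reading - err), reading + err

for reading in (0.05, 0.10, 0.50, 1.00):
    low, high = band(reading)
    rel = (high - reading) / reading * 100
    print(f"display {reading:.2f} -> {low:.3f} to {high:.3f} ohm (+/-{rel:.0f}%)")
```

A displayed 0.05 carries a worst-case error of over 40%, against 4% for a displayed 1.00 ohm.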

Thoughts?