This is a difficult thread, one where you have to reply without getting too passionate.

Just because someone makes a fancy meter (toy) that cranks out a number does not mean that either the toy or the operator is running the job.

Test results are meaningless without the ability to evaluate them. Neither the tester, nor the operator, can make those design decisions. They're simply not qualified.

A single bit of information is useless without the context of much more information.

The 5% figure is a recommendation ... not a code requirement.
What probably matters more is the minimum voltage. NEMA standards typically call for equipment to operate at 10% under nameplate voltage. For an appliance marked "120," this means the appliance should still operate in a reasonable manner down to 108 volts. If, under full load, you have 112 V, you don't have a problem.
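To put numbers on those two yardsticks, here is a quick sketch (the 5% recommendation and the 10%-under-nameplate floor are taken from the discussion above; the function name is just for illustration):

```python
def voltage_checks(nameplate, measured):
    """Compare a measured full-load voltage against two yardsticks:
    the 5% drop recommendation and a NEMA-style -10% minimum."""
    recommended_min = nameplate * 0.95   # 5% drop: a recommendation, not a code requirement
    nema_min = nameplate * 0.90          # equipment expected to run at 10% under nameplate
    return {
        "within_5pct_recommendation": measured >= recommended_min,
        "above_nema_minimum": measured >= nema_min,
    }

# 112 V under full load on a 120 V circuit: misses the 5% recommendation
# (114 V) but sits comfortably above the 108 V minimum.
print(voltage_checks(120, 112))
```

Which is the point: a reading can fail the 5% recommendation and still be nowhere near a real problem.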

Factors such as start-up current are not really relevant for a general-purpose circuit; if it were a circuit for a specific appliance (say, an air conditioner), the matter would be different.

Another cause of large voltage drop is bad connections. If the drop is fairly even across the circuits, that is one thing; if most of it seems to happen at one device, then there may be a problem that needs to be fixed.

For future jobs, you can reduce the problem by using larger wire for home runs, or by adding sub-panels. Voltage drop is something you need to consider. Yet, for all the fuss that has been made at the HI sites over voltage drop, I cannot help but wonder how the trade made it a century without those dang meters!
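A rough illustration of why a larger home run helps, using the common two-wire copper approximation Vd ≈ 2·K·I·L/CM with K ≈ 12.9 ohm-cmil/ft; the 15 A load and 100 ft run length here are assumptions picked for the example, not anything from the job:

```python
# Circular-mil areas for common branch-circuit copper conductors.
CMIL = {"14 AWG": 4110, "12 AWG": 6530, "10 AWG": 10380}

def voltage_drop(amps, one_way_feet, awg, k=12.9):
    """Approximate drop on a single-phase, two-wire copper run:
    Vd = 2 * K * I * L / CM (K in ohm-cmil/ft)."""
    return 2 * k * amps * one_way_feet / CMIL[awg]

# Hypothetical 15 A load on a 100 ft home run:
for awg in CMIL:
    vd = voltage_drop(15, 100, awg)
    print(f"{awg}: {vd:.1f} V drop ({vd / 120:.1%} of 120 V)")
```

Going from #14 to #10 on that run cuts the drop from roughly 9.4 V to about 3.7 V, which is the whole argument for upsizing long home runs.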