Hey, I had to break out an old training manual to make sure I had the info correct, but here's what I was taught.
First, the purpose of insulation resistance testing is to verify the integrity of the insulation. When a DC voltage is applied across the insulation medium, a current with three components is set up: capacitive charging current, dielectric absorption current, and leakage current.
Leakage current is the single most important component used to determine the reliability of the wire and/or equipment under test.
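If it helps to picture why, here's a little toy sketch (the magnitudes and time constants are numbers I made up, not anything from the manual) of how the three components add up over time: the charging and absorption pieces die away, and what's left is the steady leakage you actually judge by.

```python
import math

# Toy model of the three current components during a DC insulation test.
# Magnitudes and time constants are made-up illustration values; the point
# is only that the charging and absorption terms decay while the leakage
# component stays constant.

def test_current_ma(t_seconds: float) -> float:
    capacitive = 5.0 * math.exp(-t_seconds / 0.5)    # charging current, dies in seconds
    absorption = 1.0 * math.exp(-t_seconds / 10.0)   # absorption current, dies more slowly
    leakage = 0.001                                  # steady-state leakage component
    return capacitive + absorption + leakage

for t in (1, 10, 60):
    print(t, round(test_current_ma(t), 4))   # settles toward the 0.001 mA leakage value
```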
By applying 500 VDC or more to the system under test and reading either the leakage current (in milliamps) or the equivalent insulation resistance (in megohms), insulation quality can be determined.
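As a quick illustration of the arithmetic (my numbers, not the manual's), converting a leakage reading into megohms is just Ohm's law:

```python
# Minimal sketch: convert a leakage-current reading to insulation resistance
# in megohms via Ohm's law (R = V / I). The figures below are made-up
# illustration values.

def insulation_resistance_megohms(test_voltage_vdc: float, leakage_ma: float) -> float:
    """Return insulation resistance in megohms from applied DC volts and leakage in milliamps."""
    leakage_amps = leakage_ma / 1000.0           # mA -> A
    resistance_ohms = test_voltage_vdc / leakage_amps
    return resistance_ohms / 1_000_000           # ohms -> megohms

# Example: 500 VDC applied with 0.0005 mA (0.5 microamp) of leakage
print(insulation_resistance_megohms(500, 0.0005))   # -> 1000.0 megohms
```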
An operational standard (and I can't seem to find a source other than "that's what has been widely used in the industry for years") is that one megohm per 1,000 VDC applied is an allowable lower limit for ordinary conditions (68 F and 50% humidity).
Applied test voltages are as follows:
- For a Maximum Rated Voltage (MRV) of 250 VAC, apply no more than 500 VDC to the system under test.
- For an MRV of 600 VAC, apply 1,000 VDC.
- For an MRV of 5,000 VAC to 15,000 VAC, apply no more than 2,500 VDC.
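To tie the voltage classes and the one-megohm-per-1,000-volts rule together, here's a rough sketch of how you might pick the test voltage and flag a low reading. The breakpoints mirror the list above; the function names and example numbers are my own invention.

```python
# Rough sketch combining the MRV-to-test-voltage classes with the
# "one megohm per 1,000 VDC applied" lower limit. Only the classes stated
# above are covered; anything outside them raises an error.

def test_voltage_for_mrv(mrv_vac: float) -> float:
    """Return the maximum DC test voltage for a given maximum rated AC voltage."""
    if mrv_vac <= 250:
        return 500.0
    if mrv_vac <= 600:
        return 1000.0
    if 5000 <= mrv_vac <= 15000:
        return 2500.0
    raise ValueError("MRV outside the classes given above")

def passes_rule_of_thumb(measured_megohms: float, applied_vdc: float) -> bool:
    """One megohm per 1,000 VDC applied as the allowable lower limit."""
    minimum_megohms = applied_vdc / 1000.0
    return measured_megohms >= minimum_megohms

# Example: 240 VAC branch circuit tested at 500 VDC, reading 50 megohms
vdc = test_voltage_for_mrv(240)            # -> 500.0
print(passes_rule_of_thumb(50.0, vdc))     # -> True (lower limit is 0.5 megohm)
```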
In the case of residential wiring, I have not found an industry-wide standard that is different from the one above. If you are worried about the test damaging the installed wiring, you can test a short length (at least 25 ft) of the wire to make sure it will not break down. That will also establish a benchmark for reading the installed wiring.
In a residential system, remember that there is a bell transformer (xfmr) directly connected to the circuit; I would disconnect that and any other installed devices prior to the test.
At the end of the test, be sure the capacitive charge built up by the applied voltage has been discharged from the circuit.