IMHO we are going to have to be very careful with wording and reading to make this a useful conversation; there are several subtle issues here. I've been working on this post for about an hour, running around in circles.

1) Voltage drop change with ambient temperature.

2) Wire size change with ambient temperature.

As I read DiverDan's posts, I believe that he is saying that the ampacity derating mandated for elevated temperature in table 310.16 is caused primarily by 1).

I disagree with this. When a conductor is operated at maximum ampacity, the internal self heating of the conductor raises its temperature to the maximum allowed by the insulation system. Once the wire has heated up to its operating temperature, it is at its operating temperature, _not_ at ambient temperature. The resistance of the copper wire is _not_ dependent upon the ambient temperature; it is dependent upon the _copper wire_ temperature. This means that for any given conductor insulation system, the resistance of the copper _at maximum ampacity_ can be taken as a constant.

Conductor ampacity is rated at 30C ambient with a given maximum allowed conductor temperature.

If we presume 90C conductors, then operation in a 60C ambient requires an ampacity correction of 0.71.

In both cases the conductor itself is presumed to be at 90C.

This is a reduction in allowed current capacity to 1/root(2) of the table value, and a reduction in allowed internal self heating to 1/2.
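This can be sketched numerically. Under the usual steady-state assumption that conductor temperature rise is proportional to I^2 heating, the correction factor is the square root of the ratio of available temperature rises (the function name and structure here are mine, not NEC text):

```python
import math

def ampacity_correction(t_conductor, t_ambient, t_table_ambient=30.0):
    """Allowed current scales as the square root of the available
    temperature rise, because self-heating goes as I^2."""
    return math.sqrt((t_conductor - t_ambient) / (t_conductor - t_table_ambient))

# 90C insulation, 60C ambient, table rated at 30C ambient
cf = ampacity_correction(90.0, 60.0)
print(round(cf, 2))       # 0.71, the table 310.16 factor quoted above
print(round(cf ** 2, 2))  # 0.5: allowed self heating is cut in half
```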

Now let us look at the temperature change of resistance.
Let us presume a conductor that is not subject to significant self heating. This, for example, would be a long conductor sized for voltage drop rather than for maximum ampacity. The change in resistance of this conductor between 30C and 60C is not insignificant:
The temperature correction for resistance is
R2 = R1 [ 1 + K(T2 - 75) ]
which is an approximate linear correction for resistances tabulated at 75C, with K approximately 0.00323 per degree C for copper.
At 30C the resistance of the conductor is 85% of its 75C value
At 60C the resistance of the conductor is 95% of its 75C value

In other words, comparing the same conductor at 30C versus 60C, the voltage drop at 60C will be about 11% greater. (meaning that if you had a 3% VD at 30C, you would expect roughly a 3.34% VD at 60C.) The power dissipated per unit length at the same current would be greater by the same factor.
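A quick sketch of this correction, using the 0.00323 per degree C copper coefficient that also appears in the worked numbers later in this post:

```python
K = 0.00323  # per degree C, for copper resistances tabulated at 75 C

def r_factor(t2):
    """Approximate linear resistance correction: R2 = R1 * (1 + K*(T2 - 75))."""
    return 1 + K * (t2 - 75)

print(round(r_factor(30), 2))  # 0.85 -- 85% of the 75 C value
print(round(r_factor(60), 2))  # 0.95 -- 95% of the 75 C value
print(round(r_factor(60) / r_factor(30), 2))  # 1.11, ratio of VD at 60 C vs 30 C
```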

But the thermal ampacity correction shows that the heat dissipation capability of the conductor is far more significant.

Now jumping over to Bob's example. A full answer to Bob's question as stated would involve a number of factors that would totally hide the point of the discussion: wire size set by OCPD requirements (the 125% rule), derating for wire temperature rating versus terminal temperature rating, maximum ampacity of heater circuits, etc. So I'm not going to answer the question as stated; instead I will focus entirely on the NEC 310.16 ampacities and use Bob's question as the basis.

Say we have 75C wire and 75C terminations, and the conductor must safely carry 100A. Without considering ambient thermal issues, table 310.16 says that a #3 conductor can be used.

At 0F, the lowest ambient row of the table gives a correction factor of 1.05 for 75C wire, meaning that the ampacity of this #3 wire is at least 105A; we are just fine. In fact, the table doesn't go down anywhere near 0F; extrapolating, the correction factor at 0F is probably on the order of 1.4!
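Extrapolating the square-root rise model below the bottom of the table supports that guess (an estimate only, since the table itself stops well above 0F; 0F is about -17.8C):

```python
import math

# 75 C conductor, table rated at 30 C ambient, extrapolated to 0 F (-17.8 C)
cf_0f = math.sqrt((75 - (-17.8)) / (75 - 30))
print(round(cf_0f, 2))  # 1.44, on the order of the 1.4 estimated above
```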

At 100F, the temperature correction factor for 75C wire is 0.88. We now must use a #2 conductor, with a derated ampacity of 115A x 0.88 = 101A.
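In numbers (the 75C-column table values of 100A for #3 and 115A for #2 are the 310.16 entries assumed here):

```python
# NEC table 310.16, 75 C copper column (assumed values for this sketch)
AMPACITY_75C = {"#3": 100.0, "#2": 115.0}

print(round(AMPACITY_75C["#3"] * 1.05, 1))  # 105.0 A for #3 at the coolest table row
print(round(AMPACITY_75C["#2"] * 0.88, 1))  # 101.2 A for #2 at 100 F (0.88 factor)
```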

Now let us consider voltage drop.

At 100F, using the #2 conductors, we are quite close to the thermal ampacity of the wire, so we can assume that the wire is nearly at 75C. The resistive component of the voltage drop will be:
(100A * 0.194 Ohm/kFT * 0.4kFT) / 240V = 3.23%

At 0F, using the #3 conductors, we are not even close to the thermal ampacity of the wire. Since the conductor would be at its full 75C rating when carrying 100A in a 30C ambient, its self heating rise is about 45C; in a 0F (-17C) ambient at the same current, we can estimate a conductor temperature of -17C + 45C = 28C.
The resistive component of the voltage drop will be:
(100A * 0.245 Ohm/kFT * 0.4kFT * [1 + 0.00323 * (28 - 75)] ) / 240V = 3.46%
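Both voltage drop figures can be checked with one helper (a sketch; the 75C table resistances and the estimated conductor temperatures are the values used above):

```python
K = 0.00323  # copper temperature coefficient, relative to 75 C

def vd_percent(amps, r75_ohm_per_kft, kft, t_conductor_c, volts):
    """Resistive voltage drop as a percentage of the supply voltage,
    with the 75 C table resistance corrected to conductor temperature."""
    r = r75_ohm_per_kft * (1 + K * (t_conductor_c - 75))
    return 100.0 * amps * r * kft / volts

print(round(vd_percent(100, 0.194, 0.4, 75, 240), 2))  # 3.23 -- #2 at 100 F
print(round(vd_percent(100, 0.245, 0.4, 28, 240), 2))  # 3.46 -- #3 at 0 F
```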

So using Bob's example numbers, a #2 conductor would be sufficient for ampacity at 100F, but not sufficient for voltage drop. A #3 conductor would be sufficient for ampacity at 0F, but not sufficient for voltage drop.

But now let us figure the voltage drop of the #2 conductor in the 0F environment:
A #2 conductor (75C rating) carrying 100A is at its thermal ampacity at an ambient of 40C. This means that the conductor, in a 40C ambient, will be at 75C when carrying 100A, a self heating rise of 35C. So in a -17C environment, we can expect a conductor temperature of about -17C + 35C = 18C. The voltage drop is
(100A * 0.194 Ohm/kFt * 0.4kFt * [1 + 0.00323 * (18 - 75)]) / 240V = 2.64%
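A quick numeric check of this last figure (the 18C conductor temperature is the estimate above):

```python
K = 0.00323  # copper temperature coefficient, relative to 75 C

# #2 wire, 0.194 ohm/kft at 75 C, 400 ft one-way run, 100 A, 240 V supply
vd = 100 * 0.194 * 0.4 * (1 + K * (18 - 75)) / 240 * 100
print(round(vd, 2))  # 2.64
```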

In this example, the temperature coefficient of resistance makes a _significant_ difference in voltage drop, enough to mean the difference between needing to use a #1 conductor and a #2 conductor. However it seems pretty clear to me that the ampacity limits set by 310.16 are not driven by the temperature coefficient of resistance of copper.

-Jon