As the computer on 'Star Trek' says: "Insufficient Data for Meaningful Response".
The problem is that we don't know how the load will respond to the change in voltage. If these are a bunch of lightly loaded motors, or resistance heaters, then increasing the voltage might result in _increased_ current flow to the load. On the other hand, if these are heavily loaded motors, thermostat-controlled resistance heaters, or other _constant power_ loads, then increasing the voltage means an automatic reduction in current flow. If these are motors, then increasing the supply voltage will mean a reduction in the 'load current' but an increase in the 'magnetizing current' (and thus a worse power factor).
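The two extremes can be contrasted with a quick sketch (illustrative numbers only; neither model is claimed to match the actual load):

```python
# How load current responds to a voltage increase under two simple load models.
V1, V2 = 416.0, 476.0   # line-to-line volts, before and after the change
I1 = 354.0              # current at the lower voltage, amps

# Constant-impedance load (e.g. a plain resistance heater):
# current scales directly with voltage, so it goes UP.
I_const_z = I1 * (V2 / V1)

# Constant-power load (heavily loaded motor, thermostat-controlled heat):
# root3 * V * I stays fixed, so current scales as 1/V and goes DOWN.
I_const_p = I1 * (V1 / V2)

print(round(I_const_z), round(I_const_p))  # roughly 405 A vs 309 A
```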
For the purpose of calculation, we have to assume something, so I am going to assume a simple constant power load: increasing the voltage means the current goes down while the total VA supplied to the load stays the same.
The resistance of 500 MCM conductor at 25 C is 0.0129 ohms per 500 feet. The voltage drop in each conductor is 354 A * 0.0129 ohm = 4.57 V, and the line-to-line voltage delivered to the load is 424 - (root3 * 4.57) = 416 V. The VA delivered to the load is 416 * 354 * root3 = 255 kVA, and the power dissipated in the three conductors is 4.57 * 354 * 3 = 4850 W.
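As a quick check, that arithmetic can be reproduced in a few lines of Python (the resistance, current, and 424 V source are the assumed figures from the example above):

```python
import math

R = 0.0129        # ohms per conductor for the 500 ft run of 500 MCM at 25 C
I = 354.0         # load current, amps
V_source = 424.0  # line-to-line source voltage, volts

drop_per_conductor = I * R                             # ~4.57 V per conductor
V_load = V_source - math.sqrt(3) * drop_per_conductor  # ~416 V line-to-line
kva_load = V_load * I * math.sqrt(3) / 1000            # ~255 kVA to the load
wire_loss = drop_per_conductor * I * 3                 # ~4850 W in the wires

print(round(drop_per_conductor, 2), round(V_load), round(kva_load), round(wire_loss))
```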
Now we change things so that we have 476 V delivered to the load (I am guessing here, rather than solving the equation exactly), so we get 255 kVA / root3 / 476 = 309 A. Check the voltage drop and we get 309 * 0.0129 = 3.99 V per conductor, and the power dissipated in the wires is now about 3700 W. So you end up saving about 1150 W. There are 8766 hours in a year, so you end up saving about 10,000 kWh per year. So, at a rate of roughly $0.15 per kWh, I guess a savings of about $1500 per year.
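The higher-voltage case and the annual savings can be sketched the same way (the $0.15/kWh energy price is my own assumption; plug in your actual rate):

```python
import math

R = 0.0129           # ohms per conductor, same wire as before
kva = 255.0          # constant apparent power drawn by the load, kVA
V_load_new = 476.0   # new line-to-line voltage at the load

I_new = kva * 1000 / (math.sqrt(3) * V_load_new)  # ~309 A
loss_new = (I_new * R) * I_new * 3                # ~3700 W in the wires
loss_old = 4850.0                                 # from the 424 V case
saved_kw = (loss_old - loss_new) / 1000           # ~1.15 kW saved

hours = 8766                      # hours in an average year
kwh_per_year = saved_kw * hours   # ~10,000 kWh per year
price = 0.15                      # $/kWh -- assumed rate, adjust to yours
print(round(I_new), round(loss_new), round(kwh_per_year), round(kwh_per_year * price))
```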