IMHO the above determination _can't_ be made accurately. It is a design issue, one which will depend as much upon the taste of the home-owner as on physics. Part of what you are going to have to do is educate the home-owner on the problem of voltage drop, and explain that the issue will range from problematic, to safe but esthetically disturbing, to not a problem at all but expensive to solve, and offer a range of choices.
1) You have to determine what 'acceptable voltage drop' is. The NEC suggests a _total_ of 5%, with 3% for feeders, but note that this is a suggestion in a FPN, not a requirement or a guarantee of success. If every appliance in the house used 'universal switching power supplies' (such as are on laptop computers), then a 15% voltage drop might be acceptable. On the other hand, a 5% drop in voltage to an incandescent lamp means a 17% drop in light output, which might be disturbing, especially if it were sudden and momentary.
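To see why a small voltage drop matters for incandescent lamps, here is a rough sketch. The exponent of ~3.6 relating light output to voltage is my assumption (commonly quoted values run about 3.4 to 3.6), not anything from the NEC:

```python
def light_output_fraction(v_drop_pct, exponent=3.6):
    """Fraction of rated lumens from an incandescent lamp at a
    given percent voltage drop. Light output scales roughly as
    (V / V_rated) ** exponent; the exponent is an assumed figure."""
    return (1.0 - v_drop_pct / 100.0) ** exponent

# A 5% voltage drop costs roughly 17% of the light output:
loss_pct = (1.0 - light_output_fraction(5.0)) * 100.0
print(round(loss_pct, 1))
```

The point is the magnification: the lamp loses light more than three times faster than it loses voltage, which is why flicker is so visible.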
2) To determine the expected 'normal' voltage drop, I would use the smaller of the calculated demand values. Often the NEC estimates pretty high in terms of actual demand.
3) To determine the expected _worst case_ voltage drop, I would find the largest momentary load, e.g. the motor with the biggest starting current, and add that starting current to the demand load. Motor starting is probably going to cause most of the problems with things like flickering lights. I am presuming that with this big long run for electric, water comes from a well; I would be on the lookout for significant voltage drop each time the pump starts.
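The normal-vs-worst-case comparison in 2) and 3) can be sketched like this. All the numbers here are illustrative assumptions, not values from the original question: the demand load, the pump's starting surge, the run length, and the conductor resistance (roughly #2 AWG copper at about 0.194 ohm per 1000 ft) would all need to be replaced with the real figures:

```python
def feeder_drop_pct(amps, one_way_feet, ohms_per_kft, volts=240.0):
    """Percent voltage drop on a single-phase feeder.
    Current flows out and back, so the circuit length is
    twice the one-way run."""
    drop_volts = amps * (2 * one_way_feet / 1000.0) * ohms_per_kft
    return 100.0 * drop_volts / volts

running_amps = 60.0   # assumed calculated demand load
pump_surge = 40.0     # assumed extra amps while the well pump starts
run_feet = 500.0      # assumed one-way length of the feeder
r_per_kft = 0.194     # approx. #2 AWG copper, ohms per 1000 ft

print(feeder_drop_pct(running_amps, run_feet, r_per_kft))
print(feeder_drop_pct(running_amps + pump_surge, run_feet, r_per_kft))
```

With these made-up numbers the drop jumps from under 5% to about 8% each time the pump starts, exactly the kind of momentary dip that shows up as flicker even when the steady-state drop looks acceptable.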
4) It may be cheaper to deal with the momentary loads by changing the load characteristics (e.g. using VFDs or soft starters) rather than increasing the size of the service conductors.
5) It may pay to design for the future by doing the install in conduit, so that if the lights do flicker too much, the wire can be increased in size without the cost of digging.
And finally: the above is all about design theory, not my personal installation experience.
-Jon