Good callout. I was basing my figures above on 240 V, assuming the inverters were going to be split-phased. If the run is 120 V, lower voltage = more amps, so the calculation would need to be for 86 A at 120 V. You have it set to 120 and I have it set to 240, but it's doing my head in right now to justify why (or whether) we get to use the 240 V calculation for feeder voltage drop even though the conductors can carry 120 V loads at full amperage. Maybe we just don't typically expect that much of an imbalance, so we tolerate the higher potential voltage drop in that scenario.
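For what it's worth, here's a rough sketch of the comparison I'm wrestling with, using the standard two-wire drop formula VD = 2·K·I·L / cmil. The wire size (6 AWG), run length (100 ft), and K value are just placeholders for illustration, not the actual job numbers:

```python
# Rough voltage-drop comparison: same power delivered at 120 V (86 A)
# vs 240 V split phase (43 A). Wire size, length, and K are assumed
# for illustration only -- substitute the real run's numbers.

K_COPPER = 12.9      # ohm-cmil/ft, approximate for copper
CMIL_6AWG = 26240    # circular mils, 6 AWG (example size)
LENGTH_FT = 100      # one-way run length (assumed)

def vd_percent(volts, amps, length_ft=LENGTH_FT, cmil=CMIL_6AWG):
    """Percent drop for a two-wire circuit: VD = 2*K*I*L / cmil.

    The two-wire formula also covers a balanced 240 V split-phase
    feeder, since the neutral carries no current when balanced.
    """
    vd = 2 * K_COPPER * amps * length_ft / cmil
    return 100 * vd / volts

print(f"120 V / 86 A: {vd_percent(120, 86):.1f}% drop")
print(f"240 V / 43 A: {vd_percent(240, 43):.1f}% drop")
```

On the same conductors, the 120 V case comes out 4x worse in percentage terms (double the amps on half the base voltage), which is exactly why using the 240 V calculation feels hard to justify if the load could actually land all on one leg.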
Wait for @timselectric or another expert to clarify here.
edit: also one less wire at 120 V