The capacitor at the inverter that slows it down is likely needed for stability: depending on the main amplifier's phase margin it may work without it, but to be on the safe side the inverter should be at least 10 times slower than the main amplifier.
The pseudo-differential input trick is interesting but makes me a little nervous. To a first approximation you get 2 readings, so double the voltage and only sqrt(2) times the noise... seems too good to be true.
The pseudo-differential way does not even add much to the noise: the classical alternative is auto-zero, alternating between the input and a zero reading, which also takes two windows per result. So the noise level per result would be about the same, while the signal is doubled.
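As a quick sanity check of the sqrt(2) argument, here is a toy Monte Carlo sketch. All numbers are made up for illustration; "window" just means one 20 ms integration, and each window is modeled as the voltage seen by the ADC plus independent amplifier/ADC noise:

```python
import numpy as np

rng = np.random.default_rng(42)
V = 1.0          # true input voltage (arbitrary units, illustrative)
SIGMA = 0.01     # assumed per-window noise std (made up)
N = 200_000      # number of reading pairs to simulate

n1 = rng.normal(0, SIGMA, N)   # noise in the first window of each pair
n2 = rng.normal(0, SIGMA, N)   # noise in the second window

# Classical auto-zero: input window minus zero window.
# Signal V, noise sqrt(2)*SIGMA per result.
classic = (V + n1) - (0 + n2)

# Pseudo-differential: input window minus inverted-input window, halved.
# Signal V again, but noise SIGMA/sqrt(2) referred to the input.
pseudo = ((V + n1) - (-V + n2)) / 2

print("auto-zero:   mean %.4f  std %.5f" % (classic.mean(), classic.std()))
print("pseudo-diff: mean %.4f  std %.5f" % (pseudo.mean(), pseudo.std()))
```

Both schemes spend two windows per result and both land on the correct mean, but the pseudo-differential result comes out with half the input-referred noise of classical auto-zero, which is exactly the "2x voltage, sqrt(2) noise" gain.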
There is also another, more indirect advantage for noise:
In the classical auto-zero mode the input is only sampled half the time, and this makes the result sensitive to noise (from the source and input amplifier) at frequencies around 25 Hz (with 20 ms integration), which aliases down to DC. It is possible to filter this noise out, but an analog low-pass filter at some 10 Hz is slow. It gets even worse if the ADC runs at 10 PLC.
With pseudo-differential sampling the input signal is sampled essentially all the time. So there is essentially no aliased 25 Hz noise from the input, though some 25 Hz noise from the inverter remains.
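The aliasing point can be illustrated with a small numerical sketch (again a toy model, not the real converter): put a 25 Hz disturbance on an otherwise zero input, integrate it over consecutive 20 ms windows, and compare the two sampling schemes. With auto-zero the input is only seen every 40 ms, so the 25 Hz tone lands exactly at the sample rate and aliases to a DC error; with pseudo-differential sampling consecutive windows are combined, which is effectively a 40 ms integration that nulls 25 Hz:

```python
import numpy as np

F_NOISE = 25.0      # Hz, disturbance on the input
T_PLC = 0.02        # s, one 20 ms integration window (1 PLC at 50 Hz)
N_WINDOWS = 40      # consecutive windows to simulate
OVERSAMPLE = 2000   # points per window for numerical integration
PHASE = 0.7         # arbitrary phase of the disturbance

t = np.arange(N_WINDOWS * OVERSAMPLE) * (T_PLC / OVERSAMPLE)
noise = np.sin(2 * np.pi * F_NOISE * t + PHASE)
# average of the disturbance over each 20 ms window
windows = noise.reshape(N_WINDOWS, OVERSAMPLE).mean(axis=1)

# Classical auto-zero: odd windows sample ground, so the source noise only
# enters through the even (input) windows -> 25 Hz aliases to a DC error.
auto_zero = windows[0::2]

# Pseudo-differential: every window sees the input (inverted on odd windows),
# so consecutive windows are averaged and the 25 Hz tone cancels.
pseudo_diff = (windows[0::2] + windows[1::2]) / 2

print("auto-zero mean error:   %.6f" % auto_zero.mean())
print("pseudo-diff mean error: %.6f" % pseudo_diff.mean())
```

The auto-zero readings all see the same phase of the 25 Hz tone, so the error does not average away no matter how long you measure, while the pseudo-differential error cancels pair by pair for any phase of the disturbance.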
AFAIK some (if not most) sigma-delta ADCs realize their differential inputs this way, with 2 internal ADCs. So the idea is not that new.
I know a 20 V range has limited use compared to a 10 V range. However, if it comes at low effort / cost, why not?
There are a few complications with possible amps or 4-wire ohms ranges, but as far as I have looked at it, it is not too bad.
The "too good to be true" part still makes me worry a little, but there is always the fallback option of switching the inverter to a ground buffer.
There are some limitations in non-auto-zero mode, but that is OK for me.