Using the 1 kV range with zero off and internal triggering at the DMM, and a 10 second gate time at the counter, I got the following results from the Scan Advance signal.
S0: 19.5 Hz, 51.3 ms / reading (51.3 ms / sample)
S1: 18.0 Hz, 55.6 ms / reading (27.8 ms / sample)
S2: 16.5 Hz, 60.6 ms / reading (15.2 ms / sample)
S3: 14.0 Hz, 71.4 ms / reading (8.9 ms / sample)
S4: 10.9 Hz, 91.7 ms / reading (5.7 ms / sample)
...
S10: 0.234 Hz, 4273 ms / reading (4.2 ms / sample)
S11: 0.117 Hz, 8547 ms / reading (4.2 ms / sample)
If you factor in the number of ADC samples per reading, you can see that the readings converge on an average of about 4.2 ms per ADC sample. From what SilverSolder has said, the meter takes 4 samples per mains cycle in synchronous mode. That implies 4.17 ms per sample on 60 Hz mains, which agrees with my rough measurements.
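Here is a quick Python sketch of that arithmetic, using my measured rates from the list above and assuming mode Sn takes 2^n sequential ADC samples per reading:

# Per-sample time check: measured Scan Advance rates (Hz) for each mode Sn,
# assuming 2^n ADC samples are averaged per reading.
measured_hz = {0: 19.5, 1: 18.0, 2: 16.5, 3: 14.0, 4: 10.9, 10: 0.234, 11: 0.117}

for n, hz in measured_hz.items():
    period_ms = 1000.0 / hz      # time per reading
    samples = 2 ** n             # sequential ADC samples per reading
    print(f"S{n}: {period_ms:.1f} ms/reading, {period_ms / samples:.2f} ms/sample")

The higher modes all land at roughly 4.17 ms per sample, i.e. a quarter of a 60 Hz mains cycle.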
So Fluke isn't completely full of BS. However, the meter appears to add a variable delay between the groups of sequential samples used for each averaged reading. The question is how, or even whether, we can minimize this internal delay.
I assume the Controller averages the raw sequential ADC samples and then applies the range scaling, calibration corrections, and any math functions afterward. I doubt that processing accounts for the observed delay.
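For clarity, this is the processing order I'm assuming; the function name and parameters below are placeholders for illustration, not anything from Fluke's firmware:

# Hypothetical post-processing order (all names/values are made up for illustration).
def process_reading(raw_adc_samples, range_scale, cal_gain, cal_offset):
    avg = sum(raw_adc_samples) / len(raw_adc_samples)  # average the raw ADC samples first
    scaled = avg * range_scale                         # then apply range scaling
    corrected = scaled * cal_gain + cal_offset         # then calibration corrections
    return corrected                                   # any math functions (offset, dB, etc.) would follow

Even on a slow controller, this amount of arithmetic shouldn't take tens of milliseconds.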
Using 4.16... ms as the digitizing time required for each individual sample and multiplying it by the number of sequential samples in each mode gives the total digitizing time. Subtracting that from the reading periods measured above gives the following delays (the same calculation is sketched in code after this list).
S0: 47.13 ms (51.3 - 4.16...*2^0)
S1: 47.27 ms (55.6 - 4.16...*2^1)
S2: 43.93 ms (60.6 - 4.16...*2^2)
S3: 38.07 ms (71.4 - 4.16...*2^3)
S4: 25.03 ms (91.7 - 4.16...*2^4)
...
S10: 6.33 ms (4273 - 4.16...*2^10)
S11: 13.67 ms (8547 - 4.16...*2^11)
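The same subtraction in Python, for anyone who wants to extend it to the remaining modes; the reading periods are taken from my measurements above:

# Residual (unexplained) delay per mode: measured reading period minus the time
# spent digitizing 2^n samples at one sample per quarter of a 60 Hz mains cycle.
sample_ms = 1000.0 / 60.0 / 4    # 4.166... ms per ADC sample

period_ms = {0: 51.3, 1: 55.6, 2: 60.6, 3: 71.4, 4: 91.7, 10: 4273.0, 11: 8547.0}

for n, p in period_ms.items():
    digitize_ms = sample_ms * 2 ** n
    print(f"S{n}: {p - digitize_ms:.2f} ms of unexplained delay")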