OK, I will give one example, so we can all talk about it.
We have a multimeter that claims AC measurements up to 100 kHz.
The wiki page says that a 100 kHz signal has a cycle time of 10 microseconds (µs).
So the DMM needs to wait just 10 microseconds for the signal to pass through once.
The average Joe will never measure AC at 100 kHz, not even at 50 kHz.
He mostly cares about the display refresh rate.
That is 4-5 times per second at most on a modern hand-held DMM.
At the maximum of five refreshes per second, a single refresh takes 200,000 µs.
And my poor brain tells me that if the DMM takes four measurements at a 100 kHz sample rate,
it needs just 40 microseconds (µs) to do it.
If we subtract the 40 microseconds (µs) that the DMM needs to do the job from the single refresh time of 200,000 µs,
it has 199,960 µs of free time, enough to even smoke a cigarette..
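Just to double-check the arithmetic, here is a small Python sketch of the same back-of-the-envelope budget. The variable names are mine, and this says nothing about how a real DMM actually samples internally; it only reproduces the numbers above:

```python
# Back-of-the-envelope timing budget, all values in microseconds (µs).
# Hypothetical names; not how any real DMM firmware is organized.

CYCLE_TIME_US = 1e6 / 100_000              # one cycle at 100 kHz -> 10 µs
SAMPLES = 4                                # four full cycles measured
MEASURE_TIME_US = SAMPLES * CYCLE_TIME_US  # 4 x 10 µs = 40 µs

REFRESH_RATE_HZ = 5                        # display updates per second (max)
REFRESH_TIME_US = 1e6 / REFRESH_RATE_HZ    # 200,000 µs per refresh slot

# Time left over in one refresh slot after the measurement is done.
IDLE_TIME_US = REFRESH_TIME_US - MEASURE_TIME_US

print(f"cycle time:   {CYCLE_TIME_US:.0f} µs")
print(f"measurement:  {MEASURE_TIME_US:.0f} µs")
print(f"refresh slot: {REFRESH_TIME_US:.0f} µs")
print(f"idle time:    {IDLE_TIME_US:.0f} µs")
```

Running it gives an idle time of 199,960 µs, matching the figure above, so at least the arithmetic holds up.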
I am trying to simplify things so as to gain a better understanding.
I do not know if it really works that way.