I imagine they do this digitally by taking samples as mentioned in my OP. I'm going to make an assumption here that the sample rate is at least two orders of magnitude too slow for my use (which isn't really that high speed anyway) of around 1 to 10 samples per second, whereas I need at least 1 ms resolution. For example, all of my DMMs, bench and mobile, are WAAYY off and don't catch the MAX spikes, or even the MINs that are much longer in duration.
That is not how multimeters work.
Except in special cases, the input is *integrated* over a whole number of power-line cycles, so it is already averaged over some integer multiple of 16.7 ms (60 Hz) or 20.0 ms (50 Hz). This is done to produce nulls in the frequency response at the power-line frequency and its harmonics, which considerably helps normal-mode and common-mode rejection of power-line interference.
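To see where those nulls come from: integrating over a fixed window is a boxcar average, whose magnitude response is |sin(pi*f*T)/(pi*f*T)|, with zeros at every multiple of 1/T. A quick sketch (my own illustration, not any particular meter's spec) with T equal to one 60 Hz power-line cycle:

```python
import math

def boxcar_response(f_hz, t_int_s):
    """Magnitude response of an ideal integrating (boxcar) averager
    with integration time t_int_s, evaluated at frequency f_hz."""
    x = math.pi * f_hz * t_int_s
    return 1.0 if x == 0 else abs(math.sin(x) / x)

T = 1 / 60  # one power-line cycle at 60 Hz, ~16.7 ms
for f in (30, 60, 120, 180):
    print(f"{f:4d} Hz: {boxcar_response(f, T):.4f}")
# 60, 120 and 180 Hz all land exactly on nulls (response 0.0000),
# which is why line frequency and its harmonics are rejected.
```

Integrating over N power-line cycles (NPLC > 1) keeps the same nulls and narrows the passband further, which is why higher NPLC settings give better noise rejection at the cost of a slower reading rate.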
Averaging mode, if the meter supports it, continuously averages these results.
But the meter's actual response before integration extends to kHz or tens of kHz, and can be determined by looking up its AC frequency response. A 60 Hz AC input will read 0 when measured in DC mode, but a DC pulse will produce the correct average, provided it falls within the AC measurement bandwidth. So, for instance, a 0 to 10 volt DC pulse with a 50% duty cycle at 1 kHz will measure 5 volts DC.
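That 5 V figure is just the time average of the waveform. A minimal numerical check (the 1 s window is my choice so it contains a whole number of pulse periods; the waveform parameters are the ones from the example above):

```python
def pulse(t, high=10.0, freq=1000.0, duty=0.5):
    """Value of an ideal 0-to-'high' volt square pulse train at time t (s)."""
    return high if (t * freq) % 1.0 < duty else 0.0

n = 100_000  # samples across a 1 s averaging window
avg = sum(pulse(i / n) for i in range(n)) / n
print(f"average reading = {avg:.2f} V")  # 5.00 V
```

Note that if the averaging window is not a whole number of pulse periods, a single reading can land a little above or below 5 V depending on phase; averaging over many windows converges to the true 5 V.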
Measuring minimum or maximum values is limited by the integration time of each sample. Some meters use a faster-than-normal integration time for MIN/MAX, or include separate peak detection to capture faster events.
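This is exactly why the OP's meters miss the spikes: each MIN/MAX "sample" is itself an integral over the aperture, so a brief transient gets diluted. A sketch with made-up but plausible numbers (1 V baseline, a 10 V spike lasting 1 ms, integration over one 60 Hz line cycle):

```python
T = 1 / 60            # integration window: one 60 Hz line cycle, ~16.7 ms
baseline = 1.0        # volts, steady input
spike = 10.0          # volts, transient level
t_spike = 1e-3        # spike duration: 1 ms, well inside one window

# The window reports the time average, so the spike is diluted:
reading = (baseline * (T - t_spike) + spike * t_spike) / T
print(f"MAX 'sample' reads {reading:.2f} V, not {spike:.1f} V")
```

Here MAX would record about 1.5 V instead of 10 V. Catching the true peak needs either a much shorter aperture (with correspondingly worse noise rejection) or a dedicated analog peak-hold circuit in front of the ADC.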