As far as I know, there are three ways to solve this problem:
1. Regulator with remote sense => current sense resistor => high-gain differential amplifier => high-resolution ADC (this is what this device seems to use). Depending on how you set up the remote sense, you can hide the voltage drop across the current sense resistor. The downside is that you need some form of range switching, and you can lose resolution when switching between ranges (i.e. limited dynamic range).
2. Energy metering by pumping "packets" of charge into a storage device and counting how many packets you send. I haven't seen this in a real device, but it may exist.
3. Use a calibrated MOSFET with a known V/I curve as a variable current sense resistor, but otherwise similar to option 1. I imagine it is quite complicated to get the feedback loop right, but it means you can use a lower-resolution ADC and therefore a faster sample rate. I think the N6781A uses this approach with an 18-bit ADC @ 200 kHz to achieve an "effective" dynamic range of 28 bits (or so they claim). To be honest, I can't test it, because on paper this device is probably the most accurate instrument of this type that I own. I suspect that the N6781A does low-side current sensing, since that is where MOSFETs with the lowest Rds(on) are available. One thing I can tell you is that for a 6 V / 1 A supply, it uses some serious cooling on the various backplanes, so it must be generating a lot of heat.
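To make the range-switching trade-off in option 1 concrete, here is a back-of-the-envelope sketch. All the numbers (16-bit ADC, 2.5 V reference, gain of 100, the three shunt values) are invented for illustration, not taken from this device:

```python
# Why option 1 needs range switching: one shunt cannot give you both
# a large full-scale current and fine resolution at the bottom end.
ADC_BITS = 16       # assumed ADC resolution
VREF = 2.5          # assumed ADC full-scale input, volts
GAIN = 100          # assumed differential amplifier gain

def lsb_current(r_shunt):
    """Smallest resolvable current step for a given shunt resistor."""
    v_lsb = VREF / (2 ** ADC_BITS)      # one ADC step at the amplifier output
    return v_lsb / (GAIN * r_shunt)     # referred back to shunt current

def full_scale_current(r_shunt):
    """Largest current before the amplifier output hits the ADC full scale."""
    return VREF / (GAIN * r_shunt)

for r in (0.05, 5.0, 500.0):            # high / mid / low current ranges
    print(f"{r:>6} ohm shunt: full scale {full_scale_current(r)*1e3:10.3f} mA, "
          f"LSB {lsb_current(r)*1e9:10.2f} nA")
```

With these assumed values the 0.05 ohm range reaches 500 mA full scale but its LSB is ~7.6 µA, while the 500 ohm range resolves sub-nanoamps but clips at 50 µA, which is exactly the "lose resolution on different ranges" problem.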
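Option 2 can also be sketched in a few lines. The idea is that each "packet" carries a fixed, known charge (for a capacitor cycled between two comparator thresholds, Q = C * ΔV), so average current falls out of a packet count and a time interval. The capacitor value and thresholds below are hypothetical:

```python
# Toy model of charge-packet energy metering: infer average load current
# from how many fixed-size charge packets were delivered per unit time.
C = 100e-9               # packet capacitor, farads (assumed)
V_HI, V_LO = 3.4, 3.2    # assumed comparator thresholds, volts
Q_PACKET = C * (V_HI - V_LO)   # charge delivered per packet, coulombs

def measured_current(packets, seconds):
    """Average current implied by 'packets' deliveries over 'seconds'."""
    return packets * Q_PACKET / seconds

# If the load really draws 1 uA, then I = N * Q / t implies
# N = I * t / Q packets in one second (50 with these values):
true_current = 1e-6
packets_per_sec = true_current / Q_PACKET
print(measured_current(packets_per_sec, 1.0))
```

Note the quantisation: resolution is one packet per measurement interval, so small currents need either a tiny packet charge or a long integration time.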
Also, a lot of work needs to go into leakage. You're talking <1 µA in some of these use cases, and you need guard traces, shielding, etc. to minimise leakage. Note that in the photos of this device, the current sense region has lots of large holes routed around it, presumably to reduce leakage and to reduce thermal coupling to the regulator and the rest of the main PCB.
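A quick budget shows why leakage matters at these levels. The insulation resistance and guard offset below are illustrative assumptions, not measured values:

```python
# Rough leakage budget for sub-microamp measurements: a few volts across
# ordinary PCB insulation resistance already injects measurable error.
def leakage(v_across, r_insulation):
    """Leakage current through board insulation, by Ohm's law."""
    return v_across / r_insulation

# 5 V sitting across an assumed 10 Gohm of board insulation:
err = leakage(5.0, 10e9)            # 0.5 nA of error current
print(f"unguarded leakage: {err*1e9:.2f} nA")

# A driven guard trace holds the surrounding copper within ~1 mV (assumed)
# of the sensitive node, so the same insulation sees only 1 mV:
err_guarded = leakage(1e-3, 10e9)   # 0.1 pA
print(f"guarded leakage:   {err_guarded*1e12:.2f} pA")
```

Half a nanoamp of unguarded leakage is already a 0.05% error on a 1 µA reading, which is why guarding, and the routed-out slots around the sense region, earn their keep.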