It depends on the value. Up to 100M even a decent 6.5-digit multimeter, such as the Keysight 34465A, will give you 0.3% (1-year) accuracy, and if you have a reference resistor with, say, 0.1% tolerance, you can improve on that by comparison. Above 100M things become increasingly difficult as the value increases. In my home lab (using a Keithley 263) I can measure up to 100M with better than 0.2% accuracy, up to 1G with better than 0.3%, and up to 100G with better than 1%. I can compare resistors to a much better accuracy, though my reference resistors for the high values are themselves only accurate to 0.07% at 100M, 0.1% at 1G, 0.25% at 10G and 0.4% at 100G. My Keithley 617 electrometer can measure up to 200M with 0.3% (1-year) accuracy and up to 100 TOhm with about 3% accuracy (using the V/I method).
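The reference-resistor trick above can be sketched numerically. This is just an illustration with made-up readings, not any instrument's API: because the unknown and the reference are read on the same meter and range, most of the meter's absolute error cancels in the ratio, and the result inherits mainly the reference's tolerance.

```python
# Ratio transfer from a reference resistor (all readings hypothetical).
R_REF = 100.0e6            # 100 MOhm reference, say 0.1% tolerance

reading_ref = 99.82e6      # meter reading of the reference (hypothetical)
reading_unknown = 100.4e6  # meter reading of the unknown (hypothetical)

# R_x = R_ref * (reading_x / reading_ref): the meter's gain error cancels.
r_unknown = R_REF * (reading_unknown / reading_ref)
print(f"{r_unknown / 1e6:.3f} MOhm")  # about 100.581 MOhm
```

The same idea is why a 0.3%-accurate DMM plus a 0.1% reference can beat the DMM's own 1-year spec.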
Cheers
Alex
P.S. - I've just tried a direct measurement of the 100M output of the K263 with my HP3456A: the displayed value is 98.84M for the 99.007M reference, so about -0.17% error (the specification for the 100M range is 2%). It might still be OK for comparative measurements, but note that for my HP3456A I know the input leakage current is under 10pA; that might not be the case with a different unit.
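For the record, the percent-error arithmetic from the P.S. (values as posted):

```python
# Error of the HP3456A reading against the K263's 100M reference output.
reference_mohm = 99.007   # reference value from the post
displayed_mohm = 98.84    # HP3456A displayed value from the post

error_pct = (displayed_mohm - reference_mohm) / reference_mohm * 100
print(f"{error_pct:.2f}%")  # about -0.17%, well inside the 2% range spec
```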