OK, first two questions: how much current can the supply source, and how much series resistance can your device tolerate on its supply? You may be better off using a multimeter with a (correctly rated) high-voltage divider probe across a low-side sense resistor, rather than measuring through a high-voltage resistance directly. This is essentially low-side current sensing; the correctly rated probe is just to cover the edge cases, so people nearby are not showered in exploded multimeter parts if the meter's input protection tries to clamp a 5 kV supply that has some current drive behind it.
A high-voltage probe is generally 9 or 99 megohm, forming a divider with the meter's normal 1 megohm input to scale the reading down: with the 9 megohm probe, 5000 V measures as 500 V on the meter (over 10 megohm total). This also protects the meter if, for whatever reason, your current-sense resistor goes open. That resistor will need to not break down under the full supply voltage, but it can still fail from overheating; breakdown would appear as a short, while overheating would fuse it into an open circuit.
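A quick sketch of the divider arithmetic, just to illustrate the numbers above (plain Python; the 1 megohm meter input is the figure assumed in this post, some meters are 10 megohm):

```python
# The probe resistance and the meter's input resistance form a simple
# two-resistor voltage divider; the meter sees the fraction across itself.
def divided_reading(v_in, r_probe, r_meter=1e6):
    """Voltage appearing across the meter's input resistance."""
    return v_in * r_meter / (r_probe + r_meter)

print(divided_reading(5000, 9e6))   # 9 Mohm probe, 10:1 -> 500.0 V on the meter
print(divided_reading(5000, 99e6))  # 99 Mohm probe, 100:1 -> 50.0 V on the meter
```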
So for 10 mA, let's say, you can probably afford to throw 20 V of signal at the problem, which works out to a 2000 ohm low-side current shunt: 10 mA gives 20 V while dissipating only 0.2 W. You would likely build it out of something like 10 x 200 ohm resistors in series to get the voltage rating high enough.
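The shunt sizing above can be checked in a couple of lines (the 10 mA full-scale figure is the example current from this post, not a universal choice):

```python
i_full = 10e-3       # assumed full-scale current, 10 mA
r_shunt = 2000.0     # ohms, the low-side shunt value from the post

v_full = i_full * r_shunt        # full-scale shunt voltage
p_total = i_full**2 * r_shunt    # total dissipation in the shunt
n = 10                           # built as 10 x 200 ohm in series
v_each = v_full / n              # voltage stress per resistor

print(v_full, p_total, v_each)   # 20 V, 0.2 W, 2 V per resistor
```

The per-resistor voltage matters because ordinary small resistors often have surprisingly low voltage ratings; the series string spreads the stress.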
You measure this signal through the divider probe, giving you 2 V for 10 mA and 0.02 V for 0.1 mA, which should sit inside the meter's highest-accuracy range. (Its counts will generally line up with millivolts, so a 2500-count meter would roll to the next range at 2.5 V.)
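Putting the two stages together, current to shunt voltage to divided meter reading (the 10:1 ratio assumes the 9 megohm probe into a 1 megohm meter, as above):

```python
# End-to-end conversion: unknown current -> voltage across the shunt
# -> scaled-down voltage the meter actually displays.
def meter_reading(i_amps, r_shunt=2000.0, ratio=10):
    return i_amps * r_shunt / ratio

print(meter_reading(10e-3))    # 10 mA reads as 2.0 V
print(meter_reading(0.1e-3))   # 0.1 mA reads as 0.02 V, i.e. 20 mV
```

Working backwards, each millivolt of reading corresponds to 5 microamps of supply current with these values.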