I tested the Fluke 113 on a large servo drive that had 350 VDC on it (unknown capacitance).
Measured with a second high-impedance meter, the servo's voltage dropped very slowly after I turned it off.
I then connected the Fluke 113 and it dropped quite quickly down to 300-ish volts.
Then it slowed down a lot, so the PTCs did their job...
I next tried another servo, one where I knew its capacitance was in the 1000 µF range, charged to 325 VDC.
On its own it discharged to 10 V in a little over 5 minutes; with the Fluke 113 that took only 12 seconds.
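As a sanity check on those 12 seconds, here is a minimal sketch modelling the bus as an ideal RC discharge into the meter's low-impedance input. The 3 kΩ figure is the cold input resistance mentioned below; the capacitance and voltages are the ones from my test:

```python
import math

# Assumed values from the test above (approximations, not Fluke specs):
C = 1000e-6   # bus capacitance in farads (~1000 uF servo)
R = 3000.0    # assumed cold resistance of the LoZ input path, ohms
V0 = 325.0    # starting voltage
Vt = 10.0     # target "discharged" voltage

# Ideal exponential discharge: t = R * C * ln(V0 / Vt)
t = R * C * math.log(V0 / Vt)
print(f"{t:.1f} s")   # ~10.4 s
```

That lands in the same ballpark as the observed 12 seconds, with the small extra time plausibly coming from the PTCs warming up and rising in resistance partway through.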
It's somewhat off topic, but one thing still confuses me about this:
How can the meter read the voltage correctly when the PTC does its job?
Does that not mean the input resistance changes from 3 kΩ to something a lot higher?
EEVblog #373 goes into this; the link goes to 27:23, where Dave discusses the 1 MΩ hybrid resistor.
Is this how the meter can measure the voltage even when the PTCs go up in value (since they are not in the same path)?
If that is so, how big a reading error can I expect once the Fluke SV225 adapter's PTCs have become highly resistive, or will it not matter? (If not, why not?)
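To put rough numbers on my own question, here is a hedged sketch of the loading error if the PTC resistance ends up in series with the measurement path. All the resistance values are assumptions for illustration, not measured or published Fluke figures:

```python
def reading_fraction(r_series, r_input):
    """Fraction of the true voltage the meter displays when r_series
    (e.g. hot PTC resistance) forms a divider with r_input
    (the meter's measurement-path input resistance)."""
    return r_input / (r_series + r_input)

# If the measurement path is a separate ~1 MOhm divider (as discussed in
# EEVblog #373), a PTC that heats up to, say, 10 kOhm barely matters:
print(f"{reading_fraction(10e3, 1e6):.3f}")   # 0.990 -> reading ~1 % low

# But if that same 10 kOhm sat in series with a 3 kOhm measurement input,
# the reading would collapse:
print(f"{reading_fraction(10e3, 3e3):.3f}")   # 0.231 -> reading ~77 % low
```

If the high-impedance measurement divider really is a separate path from the PTC-protected load path, that would explain why the displayed voltage stays roughly correct even while the PTCs throttle the discharge current.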