Hi there!
I want to make a simple voltage measurement device myself which can:
1. Measure DC voltages in the 100–200 V range
2. Have reasonable accuracy, say an error of less than 0.1%
I thought a cheap multimeter might give me some ideas, so I tore mine down. I expected to find a dedicated AD-converter chip, but the only thing on the board is a single central chip, most likely a microcontroller (you can find my photos in the attachment).
Here are my questions:
1. What is the standard way to measure voltage in a digital multimeter?
2. How should I deal with the high DC voltage? (A microcontroller usually can't tolerate that high a voltage on its inputs.)
3. Is it better to use a separate AD converter instead of the microcontroller's built-in one?
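To make questions 2 and 3 concrete, here is the back-of-envelope arithmetic I had in mind: a resistor divider to bring 200 V down to an ADC-friendly level, and a rough count of how many ADC bits the 0.1% target implies. The resistor values here are just placeholders I picked, not a recommendation:

```python
import math

# Assumed divider: 1 MΩ on top, 10 kΩ on the bottom (placeholder values).
R_top = 1_000_000  # ohms
R_bot = 10_000     # ohms
v_in = 200.0       # worst-case input, volts

# Divider output seen by the ADC.
v_adc = v_in * R_bot / (R_top + R_bot)  # ≈ 1.98 V, safe for a ~2 V ADC input

# 0.1% of 100 V is 0.1 V, so the converter must resolve 0.1 V steps
# over a 200 V full scale: 200 / 0.1 = 2000 counts, i.e. at least 11 bits.
bits_needed = math.ceil(math.log2(200 / 0.1))

print(v_adc, bits_needed)  # ≈ 1.9802, 11
```

If this reasoning is right, the ADC resolution itself isn't the hard part; the 0.1% budget would be dominated by the divider resistors' tolerance and temperature drift, which is partly why I'm asking what real multimeters do.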
Thank you all!