I bought a multimeter kit, thinking it'd be fun to build and anything would be better than what I already had. It didn't occur to me that I'd have to calibrate the thing, and now I'm stuck.
I figured I'd buy a DMM Check to get the resistance and voltage ranges "close enough". There are versions with current sources, but they're more expensive, and the sources only go up to the milliamp range. I need to calibrate the 20 A range on this meter, and I don't have anything like a precision current source.
I was thinking that once I got resistance and volts roughly calibrated, I could get one of those huge power resistors, put it across my power supply, and apply Ohm's law to figure out how much current should be passing through.
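To be concrete, here's the back-of-the-envelope math I'm picturing, sketched in Python. The 12 V / 1 Ω numbers are just placeholders, not a recommendation; the point is that the resistor's power dissipation gets scary fast at these currents:

```python
def ohms_law_check(supply_volts, resistance_ohms):
    """Return (current_amps, power_watts) for a resistor across a supply."""
    current = supply_volts / resistance_ohms  # I = V / R
    power = supply_volts * current            # P = V * I = V^2 / R
    return current, power

# Example: a 12 V supply across a 1 ohm power resistor
current, power = ohms_law_check(12.0, 1.0)
print(f"Expected current: {current:.2f} A")   # 12.00 A
print(f"Resistor dissipates: {power:.1f} W")  # 144.0 W -- needs a serious resistor
```

So at 12 A the resistor has to shed well over 100 W, which is why I'd be measuring the voltage actually across the resistor (with the freshly calibrated volts range) rather than trusting the supply's set point.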
Is that sensible, or are there ICs or something that will do a better job at low cost?