Hi, I'm looking to purchase a multimeter and need to measure current down to 1 microamp, but I'm not familiar with reading multimeter data sheets.
As an example, I read the datasheet for the EEVblog-branded Brymen BM235. This is the section on current:
Does this mean it can read 600uA and higher?
What it means is that the most sensitive current range is 600uA.
It has a 6000 count display.
I.e. it can display 0, 1, 2, ..., 5999.
So at the 600uA range, it can show 0.0uA up to 599.9uA.
At an accuracy of ±(1% of reading + 3 digits).
I.e. the smallest non-zero step it can resolve on that range is 0.1uA.
Display = Reading in uA's
000.1 = 0.1uA
123.4 = 123.4uA etc
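To make the "±(1% + 3 digits)" spec concrete, here is a small sketch that computes the worst-case error band. It uses the standard interpretation of this kind of spec (a "digit" is one count of the least-significant displayed digit); the function name and the 0.1uA resolution for the 600.0uA range are taken from the discussion above, not from anything Brymen publishes as a formula.

```python
def accuracy_band(reading_ua, pct=0.01, digits=3, resolution_ua=0.1):
    """Worst-case error band for a spec of +/-(1% of reading + 3 digits).

    'digits' means counts of the least-significant displayed digit,
    so 3 digits on the 600.0 uA range is 3 * 0.1 uA = 0.3 uA.
    """
    err = reading_ua * pct + digits * resolution_ua
    return reading_ua - err, reading_ua + err

# For a display of 123.4 uA: 1% of 123.4 is 1.234 uA, plus 0.3 uA
# for the 3 digits, so the true current is 123.4 +/- 1.534 uA.
lo, hi = accuracy_band(123.4)
```

Note that near the bottom of the range the fixed "digits" term dominates: a 1.0uA reading still carries a ±0.3uA digit error, i.e. ±30%, which is why measuring down near 1uA on a 600uA range is optimistic.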
Each of the values given is the maximum reading for a particular range. The way the values are presented indicates the number of digits displayed and where the decimal point will be.
If your measurement exceeds this number while you are in autoranging mode, the meter will step up to the next range, increasing the limit and decreasing the absolute resolution. Likewise, if your measurement falls into the next lower range, the meter will drop down to that range, giving you more resolution.
(Well - that's the principle. In reality, however, if you had (for example) a voltage that fluttered around 5.9v to 6.1v, you'd get pretty annoyed if your meter kept switching ranges all the time - so some hysteresis is introduced to reduce this problem. As an example, on my EEVBlog 235, if I'm measuring an increasing voltage starting from, say, 5v it will stay on the 6.000v range until it gets to somewhere around the 6.5v level, when it will change to the 60.00v range. Likewise, if I then decrease the voltage, it will not switch back to the 6.000v range until I get down around the 5.5v mark.)
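The range-switching behaviour described above can be sketched in a few lines. This is purely illustrative, not Brymen firmware: the ~108% up-threshold and ~92% down-threshold are guesses inferred from the 6.5v / 5.5v anecdote, and real meters may use different values.

```python
# Hypothetical autoranging-with-hysteresis sketch (not actual BM235 logic).
RANGES = [6.0, 60.0, 600.0]  # full-scale values for each range, in volts

def next_range(current_idx, value, up_frac=1.08, down_frac=0.92):
    """Return the new range index for 'value', given the current range.

    Step up only once the value exceeds ~108% of the current full scale
    (e.g. ~6.5v on the 6.000v range); step down only once it drops below
    ~92% of the lower range's full scale (e.g. ~5.5v), so a value
    hovering near a range boundary does not cause constant switching.
    """
    idx = current_idx
    while idx < len(RANGES) - 1 and value > RANGES[idx] * up_frac:
        idx += 1
    while idx > 0 and value < RANGES[idx - 1] * down_frac:
        idx -= 1
    return idx
```

With these thresholds, a voltage fluttering between 5.9v and 6.1v stays put on whichever range the meter is already in, which is exactly the annoyance the hysteresis is there to avoid.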
Thanks a heap for your help, that makes a heap more sense.