It says in the Keysight U1731C/U1732C/U1733C manual that accuracy is given as ±(% of reading + counts of least significant digit). I don't know what to make of the 'counts of least significant digit' part.
Quote: "I get that the least significant digit would be the rightmost"
Exactly right.
Quote: "but what exactly is the count of that digit?"
One count is a change of 1 in that digit; specifically, the smallest step the display can make.
Quote: "Is higher count better?"
Absolutely not. It is a measure of error; you want this value to be as low as possible.
The U1733C is a 20 000 count LCR meter, with a resistance measurement accuracy on the 2 Ω range of ±(0.7% + 50 counts). On that range one count in the least significant digit is 0.0001 Ω, so 50 counts is 0.005 Ω and the spec works out to ±(0.7% of reading + 0.005 Ω). A "count" is just a number (counting), and fewer counts of error is better.
If you're measuring exactly 1.9 Ω, your meter is guaranteed to display a value between 1.8817 and 1.9183 Ω:
1.9 ± (0.7% × 1.9 + 0.005) = 1.9 ± 0.0183
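The arithmetic above can be sketched in a few lines of Python. The function name `error_bounds` and the constants are my own illustration, assuming the U1733C's 2 Ω range spec of ±(0.7% + 50 counts) on a 20 000 count display:

```python
# Hypothetical sketch: convert a "% of reading + counts" spec into
# guaranteed display bounds, using the U1733C 2-ohm-range figures.
RANGE_COUNTS = 20_000       # 20 000 count meter
RANGE_FULL_SCALE = 2.0      # full scale of the 2-ohm range, in ohms
RESOLUTION = RANGE_FULL_SCALE / RANGE_COUNTS   # one count = 0.0001 ohm

def error_bounds(reading_ohms, pct=0.7, counts=50):
    """Return (low, high) guaranteed display bounds for a true reading."""
    error = reading_ohms * pct / 100 + counts * RESOLUTION
    return reading_ohms - error, reading_ohms + error

low, high = error_bounds(1.9)
print(round(low, 4), round(high, 4))  # 1.8817 1.9183
```

This makes it easy to see that the counts term dominates near the bottom of a range, while the percentage term dominates near full scale.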
| 2 Ω range                  | d1 | d2 | d3 | d4 | d5 |
|----------------------------|----|----|----|----|----|
| Count of meter             | 2  | 0  | 0  | 0  | 0  |
| Error in counts            | 0  | 0  | 0  | 5  | 0  |
| Measurement maximum (Ω)    | 1. | 9  | 9  | 9  | 9  |
| Error on measurement (Ω)   | 0. | 0  | 0  | 5  | 0  |