There's a lot of discussion around the forum on accuracy, calibration, inexpensive meters, and so on, so I thought it would be fun to look at one of my favorite inexpensive meters to see how it measures up. The meter in question is the venerable Radio Shack model 22-812.

I've posted before about why I like this meter (solid build quality, large clear display, computer interface for data logging). Here, I'm curious about how well it performs in terms of measurement accuracy. I have previously adjusted the RS meter against an independent voltage reference (not the 3456A). That was some months ago and I have not touched it since then.

The test I performed is somewhat narrow in scope, looking at DC volts over a moderate range typically encountered in low voltage circuits. I was mainly interested in the statistical picture that would emerge, so I did not look at other ranges like AC volts, current, or resistance.

To perform the test I set the 22-812 alongside an HP 3456A and compared their readings of steady voltages generated by my TTi EL302P power supply. For each pair of readings I calculated the percent difference between them. I don't know the exact calibration status of the 3456A, but based on a few comparisons with other references it seems to be about right.

When weighing the results, it is instructive to consider the best-case accuracy that a 4000 count meter like the 22-812 can achieve. At full scale, the true value could be 3999.5 counts. The best the meter can indicate is either 3999 or 4000; either way it has a display error of 0.5 parts in 4000, or 0.0125%. This represents the best the meter could possibly achieve. At the low end of the range the true value could be 400.5 counts and the meter could indicate 400 or 401, so the display error is ten times greater at 0.125%.

So if the meter keeps its readings within 0.125% of the expected value, we can say it is doing about as well as it can.
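The rounding arithmetic above can be sketched in a few lines of Python (a minimal illustration of display quantization, not a model of the meter's actual behavior):

```python
# Best-case "display" error for a 4000-count meter: the reading is the
# true value rounded to the nearest count, so the worst-case rounding
# error is 0.5 counts, and the percent error depends on where on the
# range the true value falls.

def display_error_pct(true_counts):
    """Worst-case percent error from rounding to whole counts."""
    return 0.5 / true_counts * 100

print(display_error_pct(4000))  # near full scale: 0.0125%
print(display_error_pct(400))   # at 1/10 of full scale: 0.125%
```

This is why the same meter looks ten times worse, in percentage terms, at the bottom of a range than at the top.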

Plotted graphically, the results look like this:

The mean signed error is -0.007% with a standard deviation of 0.035%. The mean absolute error is 0.027% with a standard deviation of 0.025%.

The worst error is about 0.1%, with most readings falling within ±0.05%.

It is interesting to see a kind of cyclic S pattern to the errors, suggesting a degree of non-linearity in the instrument.

In case anyone is curious, the full table of data is shown below.

Voltage readings compared between RS 22-812 (adjusted) and HP 3456A:

HP (V) | RS (V)
0.1024 | 0.1023
0.1996 | 0.1995
0.3007 | 0.3004
0.3978 | 0.3976
0.4987 | 0.499
0.5975 | 0.598
0.6979 | 0.698
0.7967 | 0.797
0.8975 | 0.898
0.9957 | 0.996
1.9991 | 2.000
2.9984 | 3.000
3.9999 | 4.00
4.9973 | 5.00
6.0018 | 6.00
7.0058 | 7.00
8.0101 | 8.01
9.0105 | 9.01
10.0062 | 10.00
11.0079 | 11.01
12.008 | 12.01
13.004 | 13.00
14.003 | 14.00
15.009 | 15.01
16.013 | 16.01
17.013 | 17.01
18.015 | 18.02
19.011 | 19.01
20.019 | 20.02
21.020 | 21.02
22.020 | 22.02
23.026 | 23.03
24.029 | 24.03
25.030 | 25.03
26.027 | 26.03
27.029 | 27.03
28.028 | 28.03
29.023 | 29.02
30.029 | 30.03
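For anyone who wants to reproduce the statistics, here is a short Python sketch that recomputes the percent differences from the table above. Note the exact figures it prints may differ slightly from the ones I quoted, depending on sign convention and rounding:

```python
from statistics import mean, stdev

# (HP 3456A, RS 22-812) reading pairs from the table above, in volts.
pairs = [
    (0.1024, 0.1023), (0.1996, 0.1995), (0.3007, 0.3004), (0.3978, 0.3976),
    (0.4987, 0.499), (0.5975, 0.598), (0.6979, 0.698), (0.7967, 0.797),
    (0.8975, 0.898), (0.9957, 0.996), (1.9991, 2.000), (2.9984, 3.000),
    (3.9999, 4.00), (4.9973, 5.00), (6.0018, 6.00), (7.0058, 7.00),
    (8.0101, 8.01), (9.0105, 9.01), (10.0062, 10.00), (11.0079, 11.01),
    (12.008, 12.01), (13.004, 13.00), (14.003, 14.00), (15.009, 15.01),
    (16.013, 16.01), (17.013, 17.01), (18.015, 18.02), (19.011, 19.01),
    (20.019, 20.02), (21.020, 21.02), (22.020, 22.02), (23.026, 23.03),
    (24.029, 24.03), (25.030, 25.03), (26.027, 26.03), (27.029, 27.03),
    (28.028, 28.03), (29.023, 29.02), (30.029, 30.03),
]

# Percent difference of the RS reading relative to the HP reading.
errors = [(rs - hp) / hp * 100 for hp, rs in pairs]
abs_errors = [abs(e) for e in errors]

print(f"mean signed error:   {mean(errors):+.3f}%  (sd {stdev(errors):.3f}%)")
print(f"mean absolute error: {mean(abs_errors):.3f}%  (sd {stdev(abs_errors):.3f}%)")
print(f"worst error:         {max(abs_errors):.3f}%")
```

Plotting `errors` against the HP voltages is how the S-shaped pattern mentioned earlier shows up.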