Author Topic: MCP3421 18-bit ADC accuracy


Offline splin (Topic starter)

  • Frequent Contributor
  • Posts: 999
  • Country: gb
MCP3421 18-bit ADC accuracy
« on: November 24, 2015, 05:53:25 pm »
The Microchip MCP3421 is a low-cost, small, 18-bit serial ADC with a built-in voltage reference. The problem is that the datasheet is very sparse when it comes to min and max accuracy specs. The headline specs say the voltage reference accuracy is 2.048V ±0.05%, which looks like a useful number; but the actual electrical characteristics only give that as a typical value, so the ±0.05% presumably doesn't mean much.

They specify a gain error of 0.05% typical, 0.35% maximum over -40°C to +85°C. 0.35% is equivalent to 1/286, or just a bit over 8 bits, so it's going to need calibrating in most cases - otherwise there's not much point using an 18-bit converter. The problem then is knowing how the gain (including voltage reference drift) and offset vary with temperature. The datasheet only provides a typical gain error drift with temperature of 15ppm/°C. So if you were using this part, what overall accuracy figure would you specify to your customer?
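
A quick sanity check on that "just a bit over 8 bits" figure (trivial arithmetic, nothing MCP3421-specific):

Code: [Select]
import math

gain_error_max = 0.0035                # 0.35% max gain error, -40C to +85C, from the datasheet
print(1 / gain_error_max)              # ~285.7, i.e. 1 part in ~286
print(math.log2(1 / gain_error_max))   # ~8.16 "bits" of gain accuracy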

Characterising them yourself won't guarantee that production parts aren't substantially different, and production testing probably wouldn't be worthwhile for such a low-cost part. So would you take the datasheet typical figures and apply an n-sigma guard-band? What factor for 'n' would you use? Another problem is that the typical figures don't seem to be very reliable. The spreadsheet I've attached takes the PGA=1 gain error vs temperature (typical) figures from figure 2-6 and calculates the drift using both the gradient and box methods - roughly as in the sketch below.
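
To make that concrete, here's roughly what the spreadsheet does. The temperature/error points are illustrative stand-ins, not the actual values digitised from figure 2-6, and the "box" calculation shown is one common reading of the term (slope of the line from the reference point to each other point):

Code: [Select]
temps = [-40, -20, 0, 22, 45, 65, 85]                  # degrees C
err_ppm = [940, 600, 200, 0, -500, -900, -1340]        # gain error in ppm - placeholder values

# Gradient method: slope between each pair of adjacent points
gradients = [(err_ppm[i + 1] - err_ppm[i]) / (temps[i + 1] - temps[i])
             for i in range(len(temps) - 1)]
print("gradient: %+.1f to %+.1f ppm/C" % (min(gradients), max(gradients)))

# Box method: slope of the line from the 22C reference point to each other point
boxes = [(e - 0) / (t - 22) for t, e in zip(temps, err_ppm) if t != 22]
print("box:      %+.1f to %+.1f ppm/C" % (min(boxes), max(boxes)))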

1) The datasheet says typical gain error is 0.05% (500ppm), -40°C to +85°C. The graph, however, shows the typical error varying from +0.094% to -0.134%. So what is the 0.05% figure supposed to mean?
2) The typical drift is specified as 15ppm/°C, again from -40°C to +85°C, but fig 2-6 shows values between -45 and +32ppm/°C (or -31 to +21ppm/°C using the box method). So again inconsistent.

So what figures would you use? Assuming it is calibrated at 22°C, from fig 2-6 the typical gain error would vary from +0.22% to -0.008%. That actually isn't far from the 0.35% maximum gain error specified - certainly less than a 3-sigma margin. So 8-bit accuracy even when calibrated? (And that's ignoring the linearity, offset and error due to Vcc variations.) Over a more restricted temperature range it's not quite so bad, but even so it might make more sense to use a microcontroller's 10- or 12-bit ADC and spend a bit more on a low-drift voltage regulator or a separate reference.
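
In terms of ADC codes (back-of-envelope, using the +0.22% worst-case residual after the 22°C cal):

Code: [Select]
import math

residual = 0.0022                 # +0.22% worst-case gain error after cal, from fig 2-6
codes = 2**17                     # 18-bit bipolar: ~131072 codes per polarity
print("%.0f LSBs of error at full scale" % (residual * codes))            # ~288 LSBs
print("%.1f effective bits of gain accuracy" % math.log2(1 / residual))   # ~8.8 bits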

Have I made a silly mistake?
 

Offline diyaudio

  • Frequent Contributor
  • Posts: 683
  • Country: za
Re: MCP3421 18-bit ADC accuracy
« Reply #1 on: November 24, 2015, 09:56:46 pm »
Microchip isn't really a player in ADC innovation; I would start looking at vendors like Analog Devices or Maxim Integrated. Microchip has a reputation for skimpy ADC specifications.

 

Offline splin (Topic starter)

  • Frequent Contributor
  • Posts: 999
  • Country: gb
Re: MCP3421 18-bit ADC accuracy
« Reply #2 on: November 25, 2015, 03:21:43 pm »
Microchip isn't really a player in ADC innovation; I would start looking at vendors like Analog Devices or Maxim Integrated. Microchip has a reputation for skimpy ADC specifications.

That's true - it's almost always possible to use better-specified but more expensive parts, but that misses the point. Engineering is about trading off device costs, performance, development costs, etc. to meet requirements. Hundreds of thousands, and probably many millions, of these devices must be sold every year, so the engineers who design them in must have (mostly) made a judgment as to whether the accuracy was acceptable - though sometimes they'll get it wrong; such is the nature of dealing in uncertainties.

My question is: how would you go about creating an error budget when given a datasheet like this? It's not an uncommon situation when dealing with low-cost parts for cost-sensitive products. Clearly you wouldn't use it for something that demands five-nines guarantees, but it would probably be perfectly OK for, say, toys, where relatively high failure rates at the limits of temperature range and manufacturing tolerances are acceptable. But even there your boss is likely to want to know what percentage are likely to get sent back by customers.

Perhaps my biggest misgivings relate to the seemingly common mismatches between typical figures shown in the electrical characteristics and the typical performance charts. Sometimes you can work out that they are simple mistakes, and sometimes they're due to differences in test conditions, but often I am unable to reconcile the disparities. At that point you wonder whether the figures are remotely realistic, especially given that the manufacturer won't be bound by them as they are only 'typical'. So do you largely ignore the datasheet and measure as many devices as you reasonably can, preferably from different batches? And then what - how do you decide how big to make your guard-bands, not being privy to the manufacturer's test data and statistics?
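
The naive version of that calculation would be something like this (the measurements below are invented; the catch is that a small sample from one or two batches tells you little about lot-to-lot spread, which is exactly the problem):

Code: [Select]
import statistics

# Gain errors in ppm measured on your own sample of parts - invented numbers
measured_ppm = [320, -150, 480, 90, -60, 210, 550, -220, 130, 400]

mu = statistics.mean(measured_ppm)
sigma = statistics.stdev(measured_ppm)
for k in (2, 3, 4):
    print("k=%d guard-band: %+.0f to %+.0f ppm" % (k, mu - k * sigma, mu + k * sigma))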

If you have enough clout with the supplier then you can probably demand the data you need, but small businesses generally can't. It clearly helps to have some experience/knowledge of the underlying factors - e.g. FET gate leakage may vary over orders of magnitude between individual parts, whereas PN junction voltage/temperature characteristics are fairly well defined by the underlying physics. But, for example, with devices like this ADC it's unlikely that you are going to know how the voltage reference is implemented - buried Zener or bandgap - so you can't even guess at its performance.

For what little it's worth, I note that these ADCs are used in Chinese 5-digit meters like this: http://www.ebay.com/itm/Red-0-36-LED-5-Digit-DC-0-33-000V-Digital-Voltmeter-Voltage-Meter-Car-Panel-M42-/301724351834?hash=item46402c3d5a:g:zg8AAOSwF1dUNijs

These are typically spec'd at 0.3% + 2 counts, 25ppm/°C. Chinese specs are notorious, and those figures could plausibly be used up by the input divider resistors alone, let alone the ADC. However, it's quite possible they do meet these specs in practice - I've got a few and they seem to perform very well (within 0.03% of my 34401A at room temperature), though they were probably calibrated at room temperature. Unfortunately most engineers don't have the luxury of being able to invent specifications.
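
Taking that spec at face value (and assuming the 0.3% is of reading, which these listings rarely make clear), the error at 30V on the 33.000V range works out as:

Code: [Select]
reading = 30.000                      # volts
err = 0.003 * reading + 2 * 0.001     # 0.3% of reading + 2 counts of 1mV each
print("+/- %.3f V  (%.2f%% of reading)" % (err, 100 * err / reading))
# -> +/- 0.092 V, about 0.31% of reading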

Apologies if this has gone on a bit, but I'm sure there must be many designers here who have had this problem - so what do you do? How often is it a case of a wing and a prayer?
 

Offline splin (Topic starter)

  • Frequent Contributor
  • Posts: 999
  • Country: gb
Re: MCP3421 18-bit ADC accuracy
« Reply #3 on: November 25, 2015, 03:58:07 pm »
Talk about coincidences - within 5 minutes of writing my previous post I was following a link (posted by Dr. Frank, again!!!) that I came across an hour ago in the Keithley DMM7510 teardown thread, about using 2N3904 transistors for op-amp input protection; it points to AoE pages 294-295, chapter 5.2.2.

There on the facing page is section 5.3, "The lessons: error budget, unspecified parameters", which covers pretty much exactly the questions I was asking, including missing or poorly specified parameters in datasheets. It also notes that worst-case numbers can mean "I don't want to test this parameter, so I'll put a conservative guess in the datasheet".
 

Offline Kleinstein

  • Super Contributor
  • Posts: 14795
  • Country: de
Re: MCP3421 18-bit ADC accuracy
« Reply #4 on: November 25, 2015, 04:52:48 pm »
With these small SOT-23 parts that have the reference inside the chip, there is the problem that soldering the chip (especially lead-free) can have an influence on the reference. So even if it's tested as a bare IC, the soldered chip might behave differently.
 

