Very slow ADC clock settings have the problem that charge leaks off the sample-and-hold capacitor during the conversion, leading to errors. This should be in the datasheet somewhere. (I don't know exactly how this relates to this particular chip's settings.)
Another thing, and I find it hard to see why it wouldn't apply to this ADC: every on-chip SAR ADC I have ever used was basically useless in its last 2 bits or so. This goes back to Hitachi microcontrollers c. 1990, and everything else I have used since. I always ended up doing, say, 100 conversions (SAR ADCs are very fast), adding them up and dividing by 100; averaging N samples knocks uncorrelated noise down by a factor of √N, so 100 of them gets you 10x less noise. It would be a miracle of engineering if this 12-bit ADC delivered a clean 12 bits, especially on any real PCB!
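That averaging trick is trivial to code up. A minimal sketch in C, assuming a hypothetical adc_read_raw() that returns one raw conversion from whatever part you're using:

    #include <stdint.h>

    /* Hypothetical single-conversion read; substitute your part's ADC routine. */
    extern uint16_t adc_read_raw(void);

    /* Average n conversions: uncorrelated noise drops by roughly sqrt(n). */
    static uint16_t adc_read_averaged(uint16_t n)
    {
        uint32_t sum = 0;
        for (uint16_t i = 0; i < n; i++)
            sum += adc_read_raw();   /* SAR conversions are fast, so this is cheap */
        return (uint16_t)(sum / n);  /* n = 100 gives ~10x less noise */
    }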
Surely you did 128, not 100, especially back then!
My experience with AVRs, at least, has been good. The ATmegas (at least the older ones in the family?) have a 10-bit ADC, which is spec'd rather poorly (more like 8 bits without calibration), but at least tends to be stable (only quantization noise). Their onboard reference is pitiful, more of a glorified diode junction than any kind of reference. The XMEGAs have a 12-bit ADC (also much faster, and multichannel or pipelined depending on model), and an onboard reference that's actually worth a damn (though certainly not 12 bits of damn; you need an external ref for that, of course). I did a project recently with an ATxmega64D3 sampling at 200 kS/s (8x oversampling, 25 kS/s result), and found fairly stable LSBs there as well, even despite rather poor wiring (breakout board, fly wires everywhere). Which is actually a bit annoying, as I'd like a little noise to get some dithering!
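Incidentally, that's why the dither matters: the usual rule of thumb is that summing 4^n samples and shifting right by n buys you n extra bits of resolution, but only if there's at least an LSB or so of noise riding on the signal to decorrelate the quantization error. A sketch, using the same hypothetical adc_read_raw() as above:

    #include <stdint.h>

    extern uint16_t adc_read_raw(void);  /* hypothetical, as before */

    /* Oversample-and-decimate: sum 4^n conversions and shift right by n; the
     * result carries n extra bits, *if* ~1 LSB of noise/dither is present. */
    static uint32_t adc_read_oversampled(uint8_t extra_bits)
    {
        uint32_t count = 1UL << (2 * extra_bits);  /* 4^n samples */
        uint32_t sum = 0;
        for (uint32_t i = 0; i < count; i++)
            sum += adc_read_raw();
        return sum >> extra_bits;  /* result is (native + extra) bits wide */
    }

With a perfectly quiet input (like those stable XMEGA LSBs), every sample lands on the same code and the extra bits are just zero padding, which is exactly why a little noise would be welcome.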
I have more indirect experience with STM32, but from what I've seen and used, they seem to be in a similar category as the XMEGAs. Again, give or take errata, like glitches caused by CPU or other peripheral activity.
If you don't need a fast ADC, that's your own damn problem.
It doesn't hurt anything being there; at worst, you can always take a few samples and do whatever statistics on them you want, then disable the peripheral until it's time to sample again. And even that doesn't matter, if you don't need to conserve battery power or whatever.
I suppose the only downside is, you have to reserve a timer to set that up. But you're likely to be using one already for global timing purposes, or are likely to have extras, so it's not much of a problem most times.
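Something like this is all it takes, with every driver hook hypothetical (map them onto whatever your vendor's HAL actually calls them), run from that timer's interrupt or a polled flag:

    #include <stdint.h>

    /* Hypothetical driver hooks; substitute your MCU's actual ADC routines. */
    extern void     adc_enable(void);
    extern void     adc_disable(void);
    extern uint16_t adc_read_raw(void);

    #define BURST_LEN 16

    /* Called periodically from a timer: wake the ADC, grab a short burst,
     * run whatever statistics you like, then shut the peripheral off again. */
    void sample_tick(void)
    {
        uint32_t sum = 0;
        adc_enable();
        for (int i = 0; i < BURST_LEN; i++)
            sum += adc_read_raw();
        adc_disable();  /* skip this if you don't care about power */

        uint16_t mean = (uint16_t)(sum / BURST_LEN);
        /* ... hand `mean` (or min/max/variance) to the application ... */
        (void)mean;
    }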
OTOH, delta-sigma ADCs can be awesome. I have an ADS1118 and an MCP3550 within an inch or two of this ARM, and their noise is amazingly low. The ADS1118, at 8 conversions/sec, does a clean 16 bits, and the MCP3550 does a clean 20 bits. Of course they are very slow, but in so many applications, especially accurate ones, that's fine.
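To put a number on "clean N bits": noise-free resolution is log2(full-scale codes / peak-to-peak noise codes), measured over a pile of conversions of a quiet DC input. A sketch of just the arithmetic, with the buffer capture left to you:

    #include <math.h>
    #include <stddef.h>
    #include <stdint.h>

    /* Noise-free bits = log2(full-scale span / peak-to-peak noise), taken
     * over many conversions of a steady DC input. */
    double noise_free_bits(const int32_t *samples, size_t n, double full_scale_codes)
    {
        int32_t lo = samples[0], hi = samples[0];
        for (size_t i = 1; i < n; i++) {
            if (samples[i] < lo) lo = samples[i];
            if (samples[i] > hi) hi = samples[i];
        }
        double pp = (double)(hi - lo);
        if (pp < 1.0) pp = 1.0;              /* floor at 1 code: quantization-limited */
        return log2(full_scale_codes / pp);  /* e.g. 2^22 codes for the 22-bit MCP3550 */
    }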
The downside is actually the interface. It seems, pervasively, across all brands, there's always some stupid gotcha with them. It must be something like: a big manufacturer came to all of them (because of 2nd sourcing) with the same spec, that spec required an internal MCU to orchestrate all the functions or something, and so there's buggy software* involved, and who knows what kind of jank the bus interface (usually I2C or SPI) has?
*Well, firmware. Or, it's probably mask ROM... rigidware?!
One particularly braindead example was an ADI part (I don't remember the number) which never stops shifting. SPI frames aren't terminated by deasserting /CS; nope, if you ever accidentally get one clock too many, or too few, you have to power cycle the miserable thing.

I think there was some odd behavior with the ADS1220? But I wasn't involved in the software bring-up on that project, and it was a few years ago now. I remember they got it working eventually, and the sampling variance was in the ballpark of the datasheet values (despite putting an AFE in front), so I wasn't worried; it seems I did my job.
Tim