Author Topic: STM32F373 SDADC measure Vref_int?  (Read 4675 times)


Offline gnasiratorTopic starter

  • Contributor
  • Posts: 35
  • Country: de
STM32F373 SDADC measure Vref_int?
« on: February 18, 2016, 01:06:35 am »
Hi,
is someone of you familiar with the STM32F373?

I would like to measure the internal, calibrated reference voltage using the sigma-delta adc. As reference, I'd use either VCC or an external reference.
Judging from the datasheet, it seems as if only the SAR ADC (12 bit) has Vref_int connected to its mux and there is no pin where I could get the voltage from.

Here is my post about this in the ST forum.

The reason why I like it is because the internal reference is calibrated. So I can read that value and basically only have to trust my one external voltage reference's accuracy. I could build a resistor divider but that's two extra components and with reasonable accuracy, they're getting expensive, too.
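For reference, the route that does exist on this part goes through the SAR ADC: it can sample VREFINT directly, and ST stores a factory reading of VREFINT (taken at a known VDDA) in a read-only calibration cell, so a live reading can be ratioed against the stored one. A minimal sketch of that arithmetic in C; the address and the 3.3 V calibration condition are assumptions taken from the STM32F37x documentation and should be checked against your datasheet revision:

```c
#include <stdint.h>

/* Factory-stored raw 12-bit SAR reading of VREFINT, taken at a known
 * VDDA (3.3 V on STM32F3 parts). Address per the STM32F37x datasheet;
 * verify against your revision before relying on it. */
#define VREFINT_CAL_ADDR ((const uint16_t *)0x1FFFF7BAu)
#define VDDA_CAL_MV 3300u /* VDDA during factory calibration */

/* Ratio a live VREFINT reading against the factory one to recover the
 * actual VDDA, with no external components. */
static uint32_t vdda_mv_from_vrefint(uint16_t vrefint_raw, uint16_t vrefint_cal)
{
    /* VDDA = 3.3 V * VREFINT_CAL / VREFINT_DATA */
    return (VDDA_CAL_MV * (uint32_t)vrefint_cal) / vrefint_raw;
}
```

Note this only helps on the SAR ADC side; the sampling time for the VREFINT channel also has a specified minimum, so the snippet shows only the arithmetic, not the conversion setup.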

Any thoughts?
Kiwi by heart. But never liked Vegemite.
 

Offline michaeliv

  • Frequent Contributor
  • **
  • Posts: 260
Re: STM32F373 SDADC measure Vref_int?
« Reply #1 on: February 18, 2016, 05:40:44 pm »
It's not very clear what you're trying to do or why you're trying to do it; maybe you can rephrase it.
The way I read it after reading both your threads:
On a board you have the F373 and a 3v reference. Using the 1.2v reference you can get accurate readings but you want to use the external 3v reference. The 3v reference does not give accurate readings since it's not calibrated. Therefore you want to calibrate the 3v reference using the calibrated 1.2v reference. To do that you need to connect 3v as chip ADC reference and 1.2v as analog input. And this is what you're not able to do.
I'm assuming you can't use an external test/calibration jig at production to calibrate the 3v reference?
Well, I don't know how to do what you want, but if I had to guess, the 1.2v reference is calibrated using inaccessible calibration values, not trimmed. That means that when you select the 1.2v reference as the ADC reference, the chip internally reads calibration values and does some math. So even if you had access to the 1.2v node, it wouldn't actually be 1.2v; you would need the calibration values to do anything useful.
I haven't read through the datasheet of your chip; have you considered this?
 

Offline gnasiratorTopic starter

  • Contributor
  • Posts: 35
  • Country: de
Re: STM32F373 SDADC measure Vref_int?
« Reply #2 on: February 19, 2016, 09:22:10 am »
Uh oh, sorry, it seems I didn't make it clear enough what I want.

I'll try rephrasing it:
The SDADC comes with a programmable gain amplifier stage after the MUX. That PGA has an offset and a gain. The offset is automatically calibrated during the SDADC init procedure and the gain is configurable to 0.5, 1, 2, 4, 8 and so on.
According to the datasheet, the gain isn't very exact, though: there is a typical gain error of -2.7%, and this is what I'm trying to calibrate. I want to measure the gain error.
To do so, I use the very precise external 3V reference as the reference for the SDADC. After the offset calibration, all that's left to do is measure another known voltage, as close to the reference voltage as possible. By comparing the ADC reading to the actual (known) voltage, I can compute the real gain and then use it for all following measurements with very high precision. This step can be repeated to account for temperature drift etc.
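The two-step correction described above can be sketched as plain arithmetic. The function names, the 3 V reference value, and the single-ended 16-bit code scaling are illustrative assumptions, not the actual register interface:

```c
#include <stdint.h>

#define SDADC_FULL_SCALE 65535.0
#define VREF_MV 3000.0 /* external 3 V reference (assumed) */

/* Step 1: after offset calibration, sample a known voltage and derive
 * the real gain (nominal PGA gain folded in) from the reading. */
static double sdadc_gain_from_known(uint16_t code, double known_mv)
{
    double measured_mv = (code / SDADC_FULL_SCALE) * VREF_MV;
    return measured_mv / known_mv;
}

/* Step 2: scale every later conversion by the measured gain. */
static double sdadc_corrected_mv(uint16_t code, double gain)
{
    return (code / SDADC_FULL_SCALE) * VREF_MV / gain;
}
```

With a -2.7% gain error and a 1.2 V known voltage, the known-voltage reading comes back low; applying the measured gain to the same code recovers 1.2 V exactly, which is the whole point of the two-point procedure.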

And for this gain measurement I want to use the internal, calibrated reference voltage: I presume it will be accurate enough because it's calibrated, and it spares me from having to buy two precision resistors to generate a second known voltage.
Kiwi by heart. But never liked Vegemite.
 

Offline michaeliv

  • Frequent Contributor
  • **
  • Posts: 260
Re: STM32F373 SDADC measure Vref_int?
« Reply #3 on: February 19, 2016, 10:02:55 am »
[...] the internal, calibrated reference voltage as I presume it's going to be accurate enough [...]
I think you are mistaken here. I would say it's very likely that the reference is calibrated, not trimmed (trimming is expensive).
That basically means the reference generates somewhere between 1.1v and 1.3v instead of 1.2v. However, when you select it as an ADC input, the ADC has factory-programmed calibration data that it uses to scale the final reading appropriately. You would not have access to this factory-programmed data; you wouldn't even know it exists. All you see is an accurate reading. So even if you could access the internal reference, it wouldn't be 1.2v; it would only be 1.2v after scaling with the inaccessible factory-programmed data. Is this making sense?
Your initial idea of dividing the 3v voltage should work. What gain factor are you trying to calibrate? Silly idea: if it's the 2x gain, you don't need precision resistors, just three standard resistors. Since the SDADC has differential inputs, you can divide the 3v into three voltages with the standard resistors, measure each, add them up, and see the percentage off from the expected 3v. You can do the same with other gains, but you need more resistors.
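The trick works because the three segment voltages always add up to exactly the reference, regardless of resistor tolerance; only the ADC gain error survives in the sum. A small sketch with made-up numbers (segments of unequal, imprecise resistors, each reading scaled by a hypothetical 0.98 gain):

```c
/* Sum the differentially measured segment voltages and ratio the sum
 * against the reference they were divided from. Resistor mismatch
 * cancels; the result is the combined gain factor of the ADC path. */
static double gain_from_segment_sum(const double *seg_mv, int n, double vref_mv)
{
    double sum = 0.0;
    for (int i = 0; i < n; i++)
        sum += seg_mv[i];
    return sum / vref_mv;
}
```

For example, a 3 V reference split into physical segments of 950, 1020 and 1030 mV, each read 2% low, sums to 2940 mV and yields a gain of 0.98 even though no resistor value was known precisely.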
« Last Edit: February 19, 2016, 10:05:03 am by michaeliv »
 

Offline gnasiratorTopic starter

  • Contributor
  • Posts: 35
  • Country: de
Re: STM32F373 SDADC measure Vref_int?
« Reply #4 on: February 20, 2016, 02:48:08 am »
[...] the internal, calibrated reference voltage as I presume it's going to be accurate enough [...]
I think you are mistaken here. I would say it's very likely that the reference is calibrated, not trimmed (trimming is expensive).
That basically means the reference generates somewhere between 1.1v and 1.3v instead of 1.2v. However, when you select it as an ADC input, the ADC has factory-programmed calibration data that it uses to scale the final reading appropriately. You would not have access to this factory-programmed data; you wouldn't even know it exists. All you see is an accurate reading. So even if you could access the internal reference, it wouldn't be 1.2v; it would only be 1.2v after scaling with the inaccessible factory-programmed data. Is this making sense?
Your initial idea of dividing the 3v voltage should work. What gain factor are you trying to calibrate? Silly idea: if it's the 2x gain, you don't need precision resistors, just three standard resistors. Since the SDADC has differential inputs, you can divide the 3v into three voltages with the standard resistors, measure each, add them up, and see the percentage off from the expected 3v. You can do the same with other gains, but you need more resistors.

Hi, thanks for your reply!
I hate to contradict you, but it seems the calibration data is freely accessible from a read-only register, at least according to the reference manual, p. 204 (see attached pic).
The SAR ADC might be able to do a complete auto-calibration, though; there you're right. But I'd rather use the SDADC for its higher resolution. Unfortunately, it only does automatic offset calibration and doesn't touch gain at all. So to calibrate the gain, I'd like to use the internal reference together with its calibration data.
But to do that, I need to measure it, and I haven't found a way yet.
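Assuming the factory cell really is readable as described in the reference manual, the actual value of the internal reference can at least be reconstructed from it on the SAR side. The 3.3 V / 12-bit scaling below is an assumption taken from the way ST documents VREFINT_CAL (a raw 12-bit reading captured at VDDA = 3.3 V), so double-check it against your manual:

```c
#include <stdint.h>

/* Reconstruct the actual internal reference voltage in millivolts from
 * the factory VREFINT_CAL cell: V_REFINT = 3.3 V * VREFINT_CAL / 4095.
 * The constants are assumptions from the STM32F37x documentation. */
static double vrefint_actual_mv(uint16_t vrefint_cal)
{
    return 3300.0 * (double)vrefint_cal / 4095.0;
}
```

A typical cal value around 1489 lands near the nominal 1.2 V, which is what makes the stored reading useful as a known voltage, if only the SDADC mux could reach it.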

It's kind of weird that the higher-resolution ADC doesn't provide the same ease of calibration as the other one.

I would usually use x1 gain, maybe x2 or x4 depending on input levels. Because of the VCC-powered buffer amp in front of the SDADC, I don't need any gain below 1, since the output voltage can't swing too high anyway.
Kiwi by heart. But never liked Vegemite.
 

