Hello all,
I'm designing a 350 Ω strain gauge measurement (two half-bridges).
This is what I have so far:
- The environment is noisy and I don't need the absolute value of the measurement, so I was thinking of a ratiometric conversion rather than an absolute one.
- Planning to use the INA333 (https://www.ti.com/lit/gpn/INA333)
- Supply voltage for the ADC's REF will be 3.3 V (standard LDO, filtered).
- I'm planning to split the 3.3 V supply down to 1.65 V and feed that to the INA's VREF pin AND E+ (the split will be done with an op-amp, so there's a low output impedance into the REF pin). The plan is to get 1.65 V ± 1.5 V into the ADC.
- According to Figure 35 of the INA333 datasheet, they show a "proper common-mode voltage" resistor of 150 Ω (R1) connected in series to ground. Am I right to assume I need 350/2 Ω here? I've seen some designs using 1.27 kΩ instead. I'm not sure what this resistor is for?
- I will not be able to calibrate it at every power-up, only once when it's first installed in the product, so some temperature compensation will be needed. I was thinking of adding an NTC to the uC board and doing error measurement/compensation/calibration vs. temperature once I have the circuit running. The NTC will be placed so that it correlates 100% with the temperature of the strain gauges and wires. Am I oversimplifying things here?
Would love your comments before I start.
Thanks