Hi everyone,
I hope you're all doing well.
I'm a graduate student building a voltage measurement circuit for one of our sensors. The sensor outputs a voltage from -500 mV to +500 mV, depending on the concentration of certain ions in the soil, and it has a very high output impedance, on the order of several megaohms.
I've built a basic circuit using an INA333 instrumentation amplifier and an STM32L431 microcontroller. I'm using a REF2030 as the voltage reference, which provides dual outputs of 1.5 V and 3 V. For setting the gain, I used a 1% 100 kΩ resistor. The circuit is powered by a 9 V battery, and the ADC sample rate is set to 1 Hz.
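For reference, here's roughly the conversion I have in mind in firmware. This is a simplified sketch rather than my exact code, and it assumes the 1.5 V output biases the INA333 REF pin, the 3 V output serves as the ADC reference, and the gain is 2 (from G = 1 + 100 kΩ/RG with RG = 100 kΩ):

```c
#include <stdint.h>

/* Simplified sketch of the signal-chain math (not my exact firmware).
 * INA333 gain: G = 1 + 100k / RG, so with RG = 100k the gain is 2.
 * Assumption: the REF2030's 1.5 V output drives the INA333 REF pin, so the
 * amp output sits at 1.5 V for 0 V in and swings about 0.5 V to 2.5 V over
 * the +/-500 mV sensor range.
 * Assumption: the 3.0 V output is the ADC reference (VREF+). */
#define INA_GAIN       2.0f
#define V_MID          1.5f     /* INA333 REF pin voltage, volts   */
#define ADC_VREF       3.0f     /* ADC full-scale reference, volts */
#define ADC_MAX_COUNT  4095.0f  /* 12-bit ADC on the STM32L431     */

/* Convert one raw 12-bit ADC reading back to the sensor voltage in volts. */
float adc_to_sensor_volts(uint16_t raw)
{
    float v_out = ((float)raw / ADC_MAX_COUNT) * ADC_VREF; /* in-amp output    */
    return (v_out - V_MID) / INA_GAIN;                     /* undo offset/gain */
}
```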
I tested the circuit with various voltages generated by a Keithley 2450 source meter, ranging from -500 mV to +500 mV. The measured output was within 1 mV of the source, which is acceptable for my application.
However, when I connect the sensor, the readings are inconsistent. At times the circuit gives accurate results, within a few millivolts of what I verify with an HP 34401A multimeter, but it frequently produces erratic and inaccurate readings. I can't determine whether the cause is the in-amp input bias current, the ADC, or grounding.
One observation is that when I connect the circuit's ground to a small wire buried in the same soil as the sensor, the readings stabilize, but the voltage drops by 20-30 mV.
Any insights into what might be causing these problems would be greatly appreciated. Thanks in advance.