EEVblog Electronics Community Forum
Electronics => Beginners => Topic started by: paulca on August 16, 2024, 05:13:45 pm
-
I have a pressure transducer.
100PSI 5V Output signal 0.5V - 4.5V Linear.
I have it attached to an ADS1115 and it reads 0.457V with the transducer "open" to ambient. My meter reads it as 0.456V so the ADC is playing properly.
So when I run the offset and conversion to PSI I get about -0.9 PSI.
PSI = (V - 0.5) * (100 / 4)
The current ambient pressure in my garden, about 12 feet below me, is 1005 hPa. That's only about 0.12 PSI below standard (1013.25 hPa), nowhere near 1 PSI.
So something is not adding up.
I don't think the sensor supports negative pressure; I think it's relative to current ambient. So the attention then focuses on the 0.456 V reading. Is that meant to be 0.5 V and it's just "off"?
The other factor is that the sensor accuracy is only 2 %FS across the range, so technically it can be out by up to 2 PSI.
Should I provide a "Zero" function on the device to take the current ADC reading as "0 PSI" and offset by that rather than the static 0.5V?
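For illustration, a minimal Python sketch of the conversion with an optional tare (the names are just placeholders, assuming the nominal 0.5-4.5 V / 0-100 PSI scaling):

```python
FS_PSI = 100.0   # full-scale pressure
V_ZERO = 0.5     # nominal output at 0 PSI
V_SPAN = 4.0     # nominal span: 4.5 V - 0.5 V

def volts_to_psi(v, v_zero=V_ZERO):
    """Convert sensor voltage to PSI. Pass a 'tare' reading
    taken at ambient as v_zero to implement the Zero function."""
    return (v - v_zero) * (FS_PSI / V_SPAN)

print(volts_to_psi(0.457))                 # about -1.1 PSI with the nominal zero
print(volts_to_psi(0.457, v_zero=0.457))   # 0.0 PSI after taring at ambient
```

With the tare applied the device reads gauge pressure relative to whatever the ambient was when you zeroed it, which matches how the sensor seems to behave anyway.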
-
Could you give us a link to the transducer? It'll be useful to see the data sheet.
-
No schematic? No answer.
-
You need to apply temperature correction to the ADC result, because pressure sensors are very sensitive to their own temperature. That requires a calibration table, obtained by taking calibration measurements at specific temperature points; you then use it to convert readings to pressure. You can combine the two steps, so the thermo-compensated value comes out directly in pressure units.
Also note that the pressure sensor is sensitive to its supply voltage as well, so it's better to measure ratiometrically: feed the sensor's supply voltage to the ADC REF input and the sensor output to the ADC IN input.
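A sketch of the ratiometric idea (hypothetical names): with the sensor supply as the ADC reference, the ADC code is a fraction of the supply, so supply drift cancels out. Nominally 0 PSI sits at 10 % of supply (0.5/5) and 100 PSI at 90 % (4.5/5):

```python
def ratiometric_psi(adc_code, adc_fullscale):
    """adc_code/adc_fullscale equals Vout/Vsupply when the sensor
    supply is the ADC reference; map 10 %..90 % to 0..100 PSI."""
    ratio = adc_code / adc_fullscale
    return (ratio - 0.10) * (100.0 / 0.80)

print(ratiometric_psi(3277, 32767))  # ~0 PSI, whatever the actual supply voltage
```

The same code comes out for the same pressure whether the supply is 4.7 V or 5.0 V, which is the whole point.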
-
You have a Ratiometric pressure sensor so ensure you use a regulated power supply.
https://www.eastsensor.com/blog/0-5-4-5v-pressure-sensor/ (https://www.eastsensor.com/blog/0-5-4-5v-pressure-sensor/)
-
https://www.ebay.co.uk/itm/166056288795 (https://www.ebay.co.uk/itm/166056288795)
I did get a datasheet for it, or at least something of that "style", but I can't find it now.
I think Poroit hit the nail on the head. My lab bench USB ports are on a 5 m + 3 m unpowered hub chain, so "USB V+" is about... 4.7 V.
"Ratiometric", i.e. the 0.5 V - 4.5 V signal scale is proportional to VCC. If I give it 4.7 V and 0 PSI, it is likely going to give me 0.47 V. At full 100 PSI I'd expect about 4.23 V (4.5 x 4.7/5), but my application has no interest up there.
It does make the project slightly harder though. The ADC and the MCU both run on 3.3 V, so they cannot accurately measure their own Vcc, nor the 5 V rail feeding the sensor from USB.
I think this would be easier to solve way, way up the stack, in some Python code that handles calibration with a UI.
The temperature deviation is in the seller's blurb though:
Temperature Effect on Zero: typical 0.02 %FS/°C; maximum 0.05 %FS/°C
Temperature Effect on Sensitivity: typical 0.02 %FS/°C; maximum 0.05 %FS/°C
I can't find its calibration nominal temperature, but assuming 20 °C, those error rates won't bother my application range of 3 °C to 30 °C (worst case, 0.05 %FS/°C over that span is under 1 PSI of zero shift).
EDIT: I could just put a regulated supply on it instead :)
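Until then, since the sensor is ratiometric, the nominal 0.5 V zero and 4.0 V span can be rescaled by Vsupply/5 before converting. A sketch under that assumption (the 4.7 V figure is my rough measurement, not a spec):

```python
NOMINAL_SUPPLY = 5.0

def psi_from_volts(v, supply=4.7):
    """Rescale the nominal zero and span by the actual supply
    voltage (ratiometric sensor), then convert to PSI."""
    k = supply / NOMINAL_SUPPLY
    return (v - 0.5 * k) * (100.0 / (4.0 * k))

print(psi_from_volts(0.457))  # about -0.35 PSI, within the 2 %FS spec
```

That already brings the 0.457 V reading from roughly -1.1 PSI down to a few tenths, and a tare on top would remove the rest.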
-
I can't find its calibration nominal temperature, but assuming 20 °C, those error rates won't bother my application range of 3 °C to 30 °C.
I'm not sure whether your sensor has a temperature compensation circuit, but bare piezoresistive pressure sensors are very sensitive to temperature; you can see a difference with even a 1 °C change. Used over a 30 °C temperature range without compensation, the results will be unusable.