So at first it was inconsistent voltage under load; now I need to understand why different methods of measuring the current are inconsistent.
Let's start with the setup...
I am using an Allegro ACS772LCB-050B current sensor, set up as a low-side current sensing component. The system uses PWM to pulse some MOSFETs to let current pass to an external load. The total resistance of the load, all wiring included, is about 110 mΩ. The max voltage drop over the load will be about 8.4 V, though the IR drop of the batteries will make it a bit less. Taking the max at face value, we could get a theoretical ~75 A max current draw.
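For reference, that ceiling is just Ohm's law on the round numbers above (battery IR and the exact wiring resistance will shave it down a bit):

```python
# Theoretical max load current, ignoring battery IR drop.
V_MAX = 8.4      # max voltage across the load, volts
R_LOAD = 0.110   # total load + wiring resistance, ohms (110 mOhm)

i_max = V_MAX / R_LOAD
print(f"theoretical max current: {i_max:.1f} A")  # ~76 A
```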
I am trying to verify the output of the ACS772 sensor and have done a couple of tests. First, I put a Brymen BM235 in line while pulsing only low values, up to 10 A according to my output. The BM235 and my output were really close, within 0.1 A up to 10 A. That seemed good, but I wanted to test beyond the 10 A mark and also verify the result with another DMM. I then hooked up my Siglent SDM3055, which is still in cal, and got much lower values for the current: when I was outputting 10 A, the Siglent was measuring 6 A. My guess is that the Brymen is doing a better job of filtering out the PWM. My sensor has a filter on its output with a cutoff of ~33 Hz, and on my scope the output is a flat DC line.
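To put a rough bound on how far two meters can legitimately disagree on a pulsed current: if we idealize the load current as a rectangular pulse train at duty D with peak I_pk (an assumption; the real waveform has ripple), the mean and RMS differ by a factor of √D, so a meter responding to the average and one confused by the pulses will not read the same. The 61 A peak below is a hypothetical value chosen so a 42/255 duty averages to ~10 A:

```python
import math

def pwm_mean_rms(i_peak, duty):
    """Mean and RMS of an ideal rectangular current pulse train."""
    return i_peak * duty, i_peak * math.sqrt(duty)

# hypothetical 61 A peak at PWM value 42 (duty 42/255) averages to ~10 A
mean, rms = pwm_mean_rms(61.0, 42 / 255)
print(f"mean = {mean:.1f} A, rms = {rms:.1f} A")
```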
Let's move on to more tests. The ACS772 essentially has a copper shunt in it. The two big legs have a resistance (datasheet ~100 µΩ) that I measured at 67.2 µΩ with a GOM-804. I hooked up test leads on both sides of the leg to two different DMMs (DMM6500 on the high side, SDM3055 on the low side), then pulsed the MOSFETs at different values, measured the voltage drop across the ACS772, and compared the calculated current against the sensor's output.
At a 10 A output on the sensor, my test was pretty close, within 4%. At 20 A it was still about 4% off. I liked where this was going; I figured I could calibrate that out. But at 30 A on my sensor output, things got weird. So far the PWM values had a linear progression: 42 for 10 A, 84 for 20 A, so I went to 125 for ~30 A. Made sense to me. The current sensor showed ~30 A, but the math with the voltage drop came out to 40 A. I then increased the PWM output to 150 (max is 255). The current sensor showed ~36 A, and the voltage drop calc showed ~65 A. That last result is not accurate; it doesn't fit the testing conditions, as I am only at ~60% duty cycle with a PWM value of 150.
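For the record, the arithmetic behind this test is just duty = PWM/255 and I = ΔV/R_shunt. The 2.688 mV figure below is back-calculated from the 40 A result and my measured 67.2 µΩ, not a raw meter reading:

```python
R_SHUNT = 67.2e-6  # measured leg resistance of the ACS772, ohms

def duty(pwm, full_scale=255):
    """Duty cycle for an 8-bit PWM value."""
    return pwm / full_scale

def shunt_current(v_drop, r=R_SHUNT):
    """Ohm's-law current from the voltage drop across the sensor legs."""
    return v_drop / r

print(f"PWM 125 -> {duty(125):.0%} duty, PWM 150 -> {duty(150):.0%} duty")
# a 40 A Ohm's-law result implies a drop of 40 * 67.2 uOhm = 2.688 mV
print(f"2.688 mV drop -> {shunt_current(2.688e-3):.1f} A")
```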
To verify that the tempco wasn't the issue, I redid the test at 20 A and got the same results as the first time.
To sum up...
@10 A output of the ACS772, the Brymen agrees within 1%. The Ohm's-law calculation agrees within 4%.
@20 A output of the ACS772, the Ohm's-law calc agrees within 4%.
@30 A output of the ACS772, Ohm's law says 39 A.
@36 A output of the ACS772, Ohm's law says 65 A.
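Putting numbers on how fast the disagreement grows past 20 A, using the explicit figures above:

```python
# (sensor output A, Ohm's-law calc A) from the two problem points above
points = [(30, 39), (36, 65)]
for sensor, calc in points:
    err = (calc - sensor) / sensor * 100
    print(f"{sensor} A sensor vs {calc} A calc: {err:+.0f}%")
```

So the error jumps from ~4% to ~30% to ~80% rather than drifting gradually.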
If the issue is PWM, which it probably is, why do I get usable values up to 20A?
I am thinking of modifying one of my boards and wiring leads from the current sensor through an op amp configured as a difference amplifier, then filtering the output to get rid of the PWM. That approach is working great for the voltage issue I had...
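For the filter on that difference-amp output, the usual single-pole RC corner is f_c = 1/(2πRC). The part values below are hypothetical picks, not anything from my board, but they happen to land near the same ~33 Hz cutoff the sensor's existing output filter has:

```python
import math

def rc_cutoff_hz(r_ohms, c_farads):
    """-3 dB corner frequency of a single-pole RC low-pass filter."""
    return 1.0 / (2.0 * math.pi * r_ohms * c_farads)

# hypothetical example values: 4.7 kOhm and 1 uF
print(f"cutoff: {rc_cutoff_hz(4.7e3, 1e-6):.1f} Hz")  # ~33.9 Hz
```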
But ultimately, I am trying to build a calibration tool for calibrating my project.
Thanks