Author Topic: Ideas for modifying 5V current sensor output please.  (Read 1172 times)


Offline PeterPerkins (Topic starter)

  • Newbie
  • Posts: 1
  • Country: gb
Ideas for modifying 5V current sensor output please.
« on: December 13, 2016, 06:41:41 pm »
Hi all.

I have a generic 5V-supply, rail-to-rail, 200A current sensor with a 2.5V = 0 amps output.

0V = -100A and 5V = +100A, standard stuff. :)

The sensor output has a 220R series resistor inside the sensor.
It feeds into a generic microcontroller ADC input, which probably has an input impedance of 10k or so.

I want to reduce the sensitivity of the sensor so that I can electronically adjust it to give between, say, a 0 and +50% apparent current increase.

Bypassing the sensor with some of the current is not practicable.

Using two simple 1k resistors, tied from the sensor output to the sensor's +5V and GND rails, I can reduce the voltage swing and get an increase in current of around +30%. All good so far.
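
For reference, here's the maths behind that trick as a quick sketch, assuming the 220R is the sensor's only output impedance and the MCU input loading is negligible (both assumptions):

Code: [Select]
# Attenuation from two 1k resistors pulling the sensor output
# to +5V and GND. Assumes the sensor's 220R is its only output
# impedance and the MCU input loading is negligible.
R_SOURCE = 220.0   # sensor's internal series resistor (ohms)
R_PULL   = 1000.0  # each rail-tied resistor (ohms)

# The two pulls are equivalent (Thevenin) to 2.5V behind
# R_PULL/2, so deviations from 2.5V get divided down:
r_th = R_PULL / 2
k = r_th / (r_th + R_SOURCE)   # fraction of the swing that survives
print(f"swing retained: {k:.3f}")                      # ~0.694
print(f"apparent current increase: {1 / k - 1:+.0%}")  # ~+44%

That predicts around +44% rather than the ~+30% measured, which suggests the effective source impedance is lower than the nominal 220R, or something else is loading the node; worth checking before trusting the scaling.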

However, I want to electronically control the rail-tied modifying resistors to give, say, a 256-step adjustment.

Do you think SPI digital pots can do the job?

Any better ideas?

Do I need op-amp buffers to keep the impedance low?

Thanks for the ideas.

Offline ajb

  • Super Contributor
  • Posts: 2608
  • Country: us
Re: Ideas for modifying 5V current sensor output please.
« Reply #1 on: December 13, 2016, 07:25:36 pm »
Quote
Do you think SPI digital pots can do the job?

Sure, that's probably the way to go, as long as your current sensor can operate safely at those current ranges and is capable of swinging its output beyond 0V and +5V.  Also watch that you don't violate the input range on whatever digipot you select.
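
If you do go the digipot route, here's a minimal control sketch, assuming an MCP41010 (256-tap, 10k, SPI) as the example part and the Linux spidev interface as the host; on a bare MCU you'd clock out the same two bytes over its SPI peripheral.

Code: [Select]
# Minimal sketch: set the wiper of an MCP41010 SPI digital pot
# (256 taps, 10k). Assumes Linux spidev, e.g. on a Raspberry Pi.
import spidev

spi = spidev.SpiDev()
spi.open(0, 0)                 # bus 0, chip select 0 (assumption)
spi.max_speed_hz = 1_000_000   # well under the part's 10 MHz limit

def set_wiper(code: int) -> None:
    """Write an 8-bit wiper position, 0-255."""
    if not 0 <= code <= 255:
        raise ValueError("wiper code must be 0-255")
    spi.xfer2([0x11, code])    # 0x11 = 'write data' command, pot 0

set_wiper(128)  # mid-scale

One thing to watch with a pot in place of fixed resistors: the wiper resistance and end-to-end tolerance of a digipot are much looser than a 1% resistor, so calibrate the steps in software rather than trusting the nominal taps.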

Quote
Do I need op amp buffers to keep the impedance?

When the MCU samples that ADC channel, there's an input switch that connects the ADC's sampling capacitor to that input pin, and the capacitor is charged/discharged via the total input impedance (both internal to the MCU and the output impedance of whatever circuitry is connected to the input).  At the end of the sampling period, the input pin is disconnected and the ADC performs the conversion.

If the sampling period is too short or the input impedance is too high, the cap voltage won't wind up quite equal to the nominal input voltage, and the resulting ADC reading will be off.  So there's a direct relationship between the impedance at the ADC input and the sampling time required to achieve a given accuracy.  Increasing the sampling time can compensate for a higher impedance, but of course this increases the total conversion time, which may or may not be a problem.

If you consult the MCU's datasheet, you should see some information on the ADC input characteristics, and if you're lucky they may even give you a formula for calculating the maximum input impedance for a given sample time/resolution.
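
To put rough numbers on that, here's a minimal sketch of the usual RC-settling estimate.  The sampling-cap and internal-resistance values below are made-up examples, not from any particular MCU; pull the real figures from your datasheet.

Code: [Select]
# Rough minimum ADC sampling time for the sampling cap to settle
# to within 1/2 LSB. C_SAMPLE and R_INTERNAL are made-up example
# values (assumptions); take the real ones from the MCU datasheet.
import math

C_SAMPLE   = 10e-12   # ADC sampling capacitor (farads) - assumed
R_INTERNAL = 1000.0   # internal switch/mux resistance (ohms) - assumed
R_SOURCE   = 720.0    # external: 220R sensor + 1k||1k pull resistors
N_BITS     = 10       # ADC resolution

# Settling an RC to within 1/2 LSB needs t >= R*C*ln(2^(N+1))
r_total = R_INTERNAL + R_SOURCE
t_min = r_total * C_SAMPLE * math.log(2 ** (N_BITS + 1))
print(f"minimum sample time: {t_min * 1e9:.0f} ns")  # ~131 ns here

With these example values the external impedance is no problem at all; it only starts to bite when the source impedance gets into the tens of kilohms or the sample window is very short.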
« Last Edit: December 13, 2016, 07:27:42 pm by ajb »
 

