Hey guys, as the post suggests I'm trying to convert a signal from 1-5V to 0-2V. Is it possible using a single supply?
Circuit details
+24V single supply
+5V single supply
LM324 op-amps
How accurate do you need the conversion to be?
What is the signal? i.e. do you need to accurately reproduce all of it, or are you only concerned with the steady state, as with a logic signal?
One quick and dirty solution is to use a differential amplifier configuration (see http://www.electronics-tutorials.ws/opamp/opamp_5.html). Tie the V1 (smaller) input to a 1V reference and then configure your differential amplifier to give you a gain of 1/2. The output will be (5-1)*1/2 = 2V max and (1-1)*1/2 = 0V min. How applicable this is will depend on what your signal actually is, how fast it is, etc.
Depending on your application, your reference can be a suitably stiff resistive divider, a high-impedance divider buffered by a spare op-amp in the LM324, or an actual voltage reference. If your signal is high impedance, look into an instrumentation amplifier, which is more or less the same principle with an additional buffer stage at the front end.
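To make the arithmetic concrete, here's a minimal sketch (plain C++; the resistor values R1 = 20k / R2 = 10k and the function name are my own illustrative assumptions, not from the thread) evaluating the ideal transfer function Vout = (Vin - Vref) * R2/R1 at the endpoints:

```cpp
#include <cstdio>
#include <initializer_list>

// Ideal differential amplifier: Vout = (Vin - Vref) * (R2 / R1).
// Assumes matched ratios on both inputs (R3/R4 mirroring R1/R2),
// as in the linked tutorial. Values are illustrative: gain = 10k/20k = 0.5.
double diffAmpOut(double vin, double vref, double r1, double r2) {
    return (vin - vref) * (r2 / r1);
}

int main() {
    const double vref = 1.0;           // 1V reference tied to the V1 input
    const double r1 = 20e3, r2 = 10e3; // gain of 1/2
    for (double vin : {1.0, 3.0, 5.0}) {
        std::printf("Vin = %.1fV -> Vout = %.2fV\n",
                    vin, diffAmpOut(vin, vref, r1, r2));
    }
    // Prints 0.00V, 1.00V, 2.00V: the 1-5V span maps onto 0-2V.
}
```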
And for the digital approach: an Arduino sensing an input pin and creating an output via a DAC of some sort;
then it's just a table-lookup or mapping problem.
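A minimal sketch of that idea (Arduino C++; the pin numbers and scaling are my assumptions, and I'm using filtered PWM via analogWrite() as the "DAC of some sort", since most Arduinos have no true DAC):

```cpp
// Reads the 1-5V signal on A0 and writes a 0-2V equivalent on pin 9.
// Assumes a 5V board (10-bit ADC, 5V full scale) and an RC filter on
// the PWM output to smooth it into a DC level.
const int IN_PIN = A0;
const int OUT_PIN = 9;

void setup() {
  pinMode(OUT_PIN, OUTPUT);
}

void loop() {
  int raw = analogRead(IN_PIN);        // 0..1023 for 0..5V
  // Map 1V..5V (counts ~205..1023) onto 0V..2V (PWM ~0..102 of 255).
  long pwm = map(raw, 205, 1023, 0, 102);
  pwm = constrain(pwm, 0, 255);
  analogWrite(OUT_PIN, (int)pwm);
}
```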
Accuracy would be good. I'm sensing a 4-20mA output signal and measuring the voltage drop across a 250R sense resistor; at the moment I'm then feeding a voltage divider across that resistor into an op-amp.
It will then go into a microcontroller; I'm going to be using a 2.048V reference for the micro.
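For the firmware side, here's a minimal sketch of the scaling math (C++; the 10-bit ADC width and a divider ratio of 0.4, which would squeeze 1-5V into 0.4-2.0V under the 2.048V reference, are my assumptions, not from the thread):

```cpp
#include <cstdio>

// Assumed front end: 4-20mA through a 250R sense resistor gives 1-5V,
// then a divider with ratio 0.4 scales that to 0.4-2.0V so it fits
// under the 2.048V ADC reference.
const double VREF       = 2.048;  // ADC reference in volts
const int    ADC_COUNTS = 1024;   // assumed 10-bit converter
const double R_SENSE    = 250.0;  // ohms
const double DIV_RATIO  = 0.4;    // assumed divider ratio

double countsToMilliamps(int counts) {
    double vAdc   = counts * VREF / ADC_COUNTS; // voltage at the ADC pin
    double vSense = vAdc / DIV_RATIO;           // undo the divider
    return vSense / R_SENSE * 1000.0;           // back to loop current
}

int main() {
    // 4mA -> 1V -> 0.4V -> ~200 counts; 20mA -> 5V -> 2.0V -> ~1000 counts.
    std::printf("200 counts  = %.2f mA\n", countsToMilliamps(200));
    std::printf("1000 counts = %.2f mA\n", countsToMilliamps(1000));
}
```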
Or you could even buy a National Instruments rack, fill it with an analog input and an analog output card, program it in LabVIEW, and run the program on a reliable IBM server.
A differential amplifier and voltage divider is definitely the way to solve the problem.
But if you are going to use an ADC, I would keep it simpler. If you use a smaller resistor, say 100R, then your signal is 0.4V-2V and you can route it to your ADC directly (after a buffer amplifier). The ADC has better linearity and accuracy than your resistor divider, unless you make the divider from 0.05% resistors, so the total accuracy is higher.
Using the 100 ohm method also allows you to detect over/under-range conditions, something that almost every commercial controller does (usually ±3%); equally, you can tell when there is no sensor or no power to the sensor.
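A minimal sketch of that range checking (C++; the exact ±3% thresholds and the 1mA fault cutoff are my assumptions for illustration, reusing the countsToMilliamps() conversion from the earlier snippet):

```cpp
// Classifies a 4-20mA loop reading with ~ +/-3% margins like the
// commercial controllers mentioned above. 3% of the 16mA span is 0.48mA.
enum LoopStatus { SENSOR_FAULT, UNDER_RANGE, OK, OVER_RANGE };

LoopStatus classifyLoop(double mA) {
    if (mA < 1.0)   return SENSOR_FAULT; // no sensor or no loop power
    if (mA < 3.52)  return UNDER_RANGE;  // below 4mA minus 3% of span
    if (mA > 20.48) return OVER_RANGE;   // above 20mA plus 3% of span
    return OK;
}
```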
OK, thanks for the replies. I've got another problem: when I was measuring with my meter across the 250R resistor I was getting a 1-5V output, but now that I've put the voltage divider across the resistor and am measuring the voltage at the wiper, I'm getting 23V and 19V. That's obviously my 24V supply minus the volt drop, since the sense resistor sits on the high side. Idiot (DOH!!!). So how would I go about converting that to 1-5V or something similar? I've attached a schematic below. Cheers