Author Topic: Should I Be Using a Constant Current or Constant Voltage Device for My Project?  (Read 2043 times)


Offline Potomac (Topic starter)

  • Regular Contributor
  • *
  • Posts: 71
  • Country: us
I thought this might be a better question for the "Projects, Designs, and Technical Stuff" subforum rather than the Beginners subforum, where I've already posted one question about my project. It's a bit more advanced than the questions in the Beginners forum.

My question is whether I should be using a Constant Current or Constant Voltage Device for my project.  Or maybe I need both for different sub-components of the project?

Let me explain my project in more detail below.

I'm testing 12 (soon to be 16) IR transmissive slot sensors to see how consistent the data is across a batch of them. They are the EE-SX1140 model from Omron. (Link here: http://www.omron.com/ecb/products/pdf/en-ee_sx1140.pdf)  On one side there is an IR LED ("emitter"). On the other side is a phototransistor ("detector").

The Omron sensors are usually used in vending and automotive applications as binary "on/off" switches. However, I'm taking analog readings off of them using a 1k Ohm resistor. I'm trying to use them to distinguish between materials of different transparencies. When a semi-transparent material is in place, a voltage is output that is proportional to how much light is transmitted across the slot. In the picture way at the bottom of this post, you can see the yellow wires going into the analog inputs of my Arduino.

My main goal here is to get each of the 16 sensors to register approximately the same ADC values on my Arduino for the same transparent object, even with the manufacturing tolerances in the LEDs/phototransistors and resistors. (Note: I've gone ahead and gotten resistors with 0.1% tolerance. Also, I just finished some mounts that go on top of the sensors to position each object in exactly the same place on each sensor)
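
In case it helps show what I mean by comparing the batch, here is a minimal logging sketch (this assumes a board with 16 analog inputs such as an Arduino Mega; the serial speed and sample interval are just placeholders):

Code:
const uint8_t NUM_SENSORS = 16;

void setup() {
  Serial.begin(115200);
}

void loop() {
  // Read every detector channel and print one comma-separated line,
  // so the spread across the batch is easy to eyeball or graph.
  for (uint8_t ch = 0; ch < NUM_SENSORS; ch++) {
    Serial.print(analogRead(A0 + ch));               // A0..A15 on a Mega
    Serial.print(ch < NUM_SENSORS - 1 ? "," : "\n");
  }
  delay(100);
}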

To get clean data, I realize that I can't just use an unregulated power supply. It's apparent that there will be voltage and current spikes/dips that will affect the readings.

I think I will have to use a Constant Current device to drive the LED emitters, since the amount of light they output depends on current. I think I will have to use a Constant Voltage device on the phototransistor detector side, since I want the same voltage (5.0V) coming into the Collector pin of each BJT.
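
For a sense of scale (the forward-voltage figures here are assumed for illustration, not taken from the datasheet): driven from a fixed 5V through the 191 Ohm resistor mentioned below, the LED current is I = (5V - Vf)/191 Ohm. If Vf varies from part to part between roughly 1.1V and 1.3V, the current varies from about 20.4mA down to about 19.4mA, a spread of roughly 5% that shows up directly as a brightness difference between sensors. A constant current driver takes that Vf spread out of the equation.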

Can you guys let me know if these assumptions are right? If not, can you walk me through any misconceptions I may have? Would be greatly appreciated.

I have tentatively selected these Constant Current and Constant Voltage devices. Any better suggestions would also be great.

Voltage:  Sparkfun PRT-13032
https://www.sparkfun.com/products/13032

• 6-12V input voltage via barrel jack or 2-pin header
• 3.3V or 5V regulated output voltage
• 800mA operating current



Current:  LDD-300H

http://www.mouser.com/ProductDetail/Mean-Well/LDD-300H/?qs=sGAEpiMZZMt5PRBMPTWcaTWeSxpmncu0STjW5VhFefI%3d





Here's a pic of my project at the moment, which I posted for the other thread:

https://www.eevblog.com/forum/beginners/question-about-ground-wire-on-ac-dc-power-supply-from-mean-well-inc/

I'm using a 191 Ohm/0.1% resistor for the LEDs, and a 1k Ohm/0.1% resistor for the detectors.

Everything is on hex standoffs and perf boards for ease of removal.

« Last Edit: May 25, 2016, 02:01:03 am by Potomac »
 

Offline Mastrofski

  • Contributor
  • Posts: 16
  • Country: us
The LEDs will need to receive constant current. Whether that comes from a constant current driver itself or from a voltage fed into a current regulator makes no difference, provided the constant voltage source can supply the forward voltage of the LED. I work in the lighting industry, and even on our light engines you'll often find a bunch of LM317s scattered around. Use whichever is easier for you to implement. I'd probably go constant voltage and convert that to a current source, given that you already have a constant voltage going to other parts of your circuit.
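
For example (numbers assumed for illustration, aiming at roughly the 20mA implied by the 191 Ohm resistor): an LM317 wired as a current source regulates 1.25V across its programming resistor, so R = 1.25V / 0.020A = 62.5 Ohm. A standard 62 Ohm part gives about 20mA regardless of each LED's forward voltage, as long as the supply leaves enough headroom for Vf plus the regulator's dropout.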
 

Offline danadak

  • Super Contributor
  • ***
  • Posts: 1875
  • Country: us
  • Reactor Operator SSN-583, Retired EE
One way of calibrating a design for analog variation is during manufacture.

Basically you have a cal routine in the DUT, and have the DUT communicate with an external high-precision stimulus generator. During cal, the DUT commands the stimulus generator to output a value, then measures the signal chain output, essentially the signal plus the errors picked up through the signal chain. It then saves measured vs. stimulus pairs, which allows a least-squares or higher-order polynomial correction to be created and applied to future readings when the DUT is in the user's hands.

Very effective, and it allows crappy systems and cheap components to be used instead of high-cost parts and solutions.
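
A rough sketch of how that correction could look on the Arduino side, boiled down to a simple two-point (gain/offset) fit per channel; the channel count, reference levels and helper names below are placeholders, not anything from a real cal setup:

Code:
const uint8_t NUM_SENSORS = 16;

struct Cal { float gain; float offset; };   // corrected = gain * raw + offset
Cal cal[NUM_SENSORS];

// Build the per-channel correction from two points taken during cal:
// rawLo/rawHi are the ADC readings at the low and high reference levels,
// refLo/refHi are the values every channel should report at those levels.
void buildCal(uint8_t ch, int rawLo, int rawHi, float refLo, float refHi) {
  cal[ch].gain   = (refHi - refLo) / float(rawHi - rawLo);
  cal[ch].offset = refLo - cal[ch].gain * rawLo;
}

float readCorrected(uint8_t ch) {
  return cal[ch].gain * analogRead(A0 + ch) + cal[ch].offset;
}

void setup() {
  Serial.begin(115200);
  // Example: during cal, channel 0 read 12 counts with an opaque blank and
  // 850 counts with the clear reference; map those readings to 0.0 and 1.0.
  buildCal(0, 12, 850, 0.0, 1.0);
}

void loop() {
  Serial.println(readCorrected(0));
  delay(100);
}

A least-squares or higher-order polynomial fit would just replace buildCal() with more calibration points; a two-point fit already covers the gain and offset spread from the LED/resistor tolerances.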


Regards, Dana.
Love Cypress PSOC, ATTiny, Bit Slice, OpAmps, Oscilloscopes, and Analog Gurus like Pease, Miller, Widlar, Dobkin, obsessed with being an engineer
 

Offline CJay

  • Super Contributor
  • ***
  • Posts: 4136
  • Country: gb
Second that, definitely constant current for the LED, but I suspect you're going to run into problems with ambient light effects too; you really need to be able to calibrate out the effect of shadow and daylight.

One way to do that would be to pulse the LED: when the LED is off, the sensor gives a value for ambient light, which can be taken out of the equation, and you should then be able to 'level' the results to give a true representation of the opacity between sensor and source.
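
A minimal sketch of that idea, assuming there's a spare digital pin that can switch the LED current source on and off (the pin number and settling delays below are guesses, not measured values):

Code:
const uint8_t LED_ENABLE_PIN = 7;   // assumed pin gating the LED current source

void setup() {
  pinMode(LED_ENABLE_PIN, OUTPUT);
  Serial.begin(115200);
}

int readOneSensor(uint8_t analogPin) {
  digitalWrite(LED_ENABLE_PIN, LOW);    // LED off: ambient light only
  delayMicroseconds(500);               // let the phototransistor settle
  int ambient = analogRead(analogPin);

  digitalWrite(LED_ENABLE_PIN, HIGH);   // LED on: ambient + transmitted IR
  delayMicroseconds(500);
  int lit = analogRead(analogPin);

  digitalWrite(LED_ENABLE_PIN, LOW);
  return lit - ambient;                 // ambient contribution cancelled
}

void loop() {
  Serial.println(readOneSensor(A0));
  delay(100);
}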

Your constant current source doesn't need to be particularly complicated, and for the currents you're talking about a FET or bipolar transistor and a couple of resistors will suffice if you don't have access to a more complex solution.
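
For example (component values assumed for illustration): with the classic BJT-plus-emitter-resistor current sink, hold the base at a fixed reference of, say, 2.5V; the emitter then sits about one Vbe (~0.65V) lower, so an emitter resistor around 93 Ohm sets the collector current to (2.5 - 0.65)/93 ≈ 20mA, largely independent of the LED's forward voltage.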
 

