EEVblog Electronics Community Forum

Electronics => Beginners => Topic started by: Spork Schivago on April 08, 2019, 10:37:20 pm

Title: How do LED backlight testers know what voltage to provide?
Post by: Spork Schivago on April 08, 2019, 10:37:20 pm
Hello,

How do the LED backlight testers know what voltage to safely provide to the strips of LEDs so they do not end up drawing too much current and fry?   I'm talking about the ones, like this one:   https://www.shopjimmy.com/sid-gj2c-led-tv-backlight-tester.htm (https://www.shopjimmy.com/sid-gj2c-led-tv-backlight-tester.htm)

It shows the output voltage is between 0VDC - 300VDC. How can it safely test strings of different lengths without knowing how many LEDs there are or the Vf of the LEDs? Thanks!
Title: Re: How do LED backlight testers know what voltage to provide?
Post by: Nerull on April 08, 2019, 11:26:41 pm
It's probably designed as a constant current source.
Title: Re: How do LED backlight testers know what voltage to provide?
Post by: Doctorandus_P on April 08, 2019, 11:31:12 pm
Indeed. They are current sources: they push a set current through the LEDs and measure the voltage needed to do so.
The test current is probably specified in the datasheet of those testers.
Title: Re: How do LED backlight testers know what voltage to provide?
Post by: Spork Schivago on April 09, 2019, 02:17:38 am
I'm not following. How would a constant current source work in this case? Let's say for 1 LED vs 20 LEDs, the first one draws 20mA for If and 1.2VDC for Vf, but the second set there, the 20 LEDs, draw 20mA for If but 3VDC for Vf. The 20 LEDs would be wired in series. So what do they do? Provide a constant source of 20mA, but how do they measure the needed voltage, especially without knowing what it needs? I am still struggling a little with understanding this.

If I set my PSU to 20mA and the output voltage to 30VDC, I'd think I'd still somehow damage the LED, wouldn't I? I realize diodes are current driven, not voltage driven. Thanks!
Title: Re: How do LED backlight testers know what voltage to provide?
Post by: Nerull on April 09, 2019, 02:21:58 am
If you set your PSU to 20mA, it will not be able to output 30V. It will output a maximum of 30V, or the voltage required to reach 20mA, whichever is lower.

A constant current source can be as simple as this: https://www.electronics-notes.com/images/transistor-active-current-source-01.svg (https://www.electronics-notes.com/images/transistor-active-current-source-01.svg)
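To put rough numbers on how a circuit like that sets its current, here's a minimal sketch; the reference voltage, Vbe and emitter resistor values are just assumed examples, not taken from the linked schematic.

Code: [Select]
# Minimal sketch of the classic single-transistor constant current source.
# Values below are assumptions for illustration only.

V_REF = 2.0   # volts, assumed base reference (e.g. from a divider or two diodes)
V_BE  = 0.65  # volts, typical silicon base-emitter drop
R_E   = 68.0  # ohms, assumed emitter resistor

# The emitter sits one V_BE below the base, so the emitter resistor sees
# V_REF - V_BE across it, which fixes the current almost independently of
# what the collector (load) voltage does.
i_set = (V_REF - V_BE) / R_E
print(f"Set current = {i_set * 1000:.1f} mA")   # about 19.9 mA

# The load only changes the collector voltage; as long as the transistor
# stays out of saturation, the current stays near i_set.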
Title: Re: How do LED backlight testers know what voltage to provide?
Post by: xavier60 on April 09, 2019, 03:01:54 am
I posted this on the Badcaps Forum a few years ago. I'll add that the 330VDC is produced by an isolated mains-powered SMPS.
Quote
I have traced some of the circuitry, it makes better sense now.
With the test leads open, the Source of Q1 rises to the point where the G-S voltage drops to the threshold voltage of about 2.5V. Q1's channel resistance will be very high.
When the leads are connected to a load, very little current flows through Q1 at first; the current increases as C8 charges. Q1 eventually ends up with about a 3V drop across it.
The current limiting is done by an unidentified SMPS IC that I have represented by Q2.
Title: Re: How do LED backlight testers know what voltage to provide?
Post by: Brumby on April 09, 2019, 03:24:32 am
So what do they do?   Provide a constant source of 20mA, but how do they measure the needed voltage, especially without knowing what it needs?   I am still struggling a little with understanding this.

A constant current supply does, indeed, have a source of voltage behind it - but the regulation circuit does not worry about the voltage - it ONLY monitors the current. It will, thus, only allow the appropriate amount of current to pass through - and the voltage will be whatever it turns out to be.
Title: Re: How do LED backlight testers know what voltage to provide?
Post by: Zero999 on April 09, 2019, 08:35:22 am
I'm not following. How would a constant current source work in this case? Let's say for 1 LED vs 20 LEDs, the first one draws 20mA for If and 1.2VDC for Vf, but the second set there, the 20 LEDs, draw 20mA for If but 3VDC for Vf. The 20 LEDs would be wired in series. So what do they do? Provide a constant source of 20mA, but how do they measure the needed voltage, especially without knowing what it needs? I am still struggling a little with understanding this.
The forward voltage drop of an LED is current dependent. The correct way to drive an LED is with a constant current source. You can use constant voltage, but it's tricky, since the resistance of an LED drops with both increasing temperature and increasing current. Suppose an LED has a nominal forward voltage of 3V at 10mA. If you set the power supply's voltage to 3V but don't limit the current, the LED might pass 10mA at first, but if it gets warm, its resistance will fall and it'll pass more than 10mA, causing it to heat up further and pass even more current. Using a constant current source, which passes the same current irrespective of the terminal voltage, solves this problem. If the same LED is connected to a 10mA constant current source and it warms up, its resistance will fall, but the power supply will limit the current to 10mA, so the voltage across it will fall, in accordance with Ohm's law, and it will heat up less, as P = V*I.

Quote
If I set my PSU to 20mA and the output voltage to 30VDC, I'd think I'd still somehow damage the LED, wouldn't I? I realize diodes are current driven, not voltage driven. Thanks!

Now you're referring to a constant current, constant voltage power supply. The PSU has two modes: constant voltage and constant current. It can't regulate both the current and the voltage at the same time, because the load fixes the relationship between them (Ohm's law).

When the power supply has a high resistance connected to it, it will be in constant voltage mode and when a low resistance is connected to it, it will be in constant current mode.

R = V/I = 30V/20mA = 30/0.02 = 1500 = 1.5k

So when the resistance connected to your power supply is above 1.5k, it will regulate the voltage, and when the resistance is lower, it will regulate the current.

If your LED has a forward voltage of 3V at 10mA, it will increase a little at 20mA, but not much, say 3.2V, so its effective resistance will be R = V/I = 3.2/0.02 = 160R.

Assuming the LED can handle 20mA and the power supply can switch between constant voltage and constant current modes instantaneously, switching it on and connecting an LED with a forward voltage of 3V to its output will cause the voltage to rapidly drop from 30V to about 3V and the current to increase from zero to 20mA, so the LED won't be damaged.

In reality, the power supply might take a few milliseconds to switch from constant voltage to constant current mode, briefly allowing a much higher current to flow through the LED, which could damage it. To avoid this, connect the LED to the output of the power supply before switching it on; then it should start in constant current mode, limiting the current through the LED to 20mA.
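A worked version of those numbers, as a minimal sketch (same 30V / 20mA settings and 3.2V forward voltage as above):

Code: [Select]
# Worked example of the CV/CC crossover described above.

V_SET = 30.0    # volts, voltage limit set on the bench supply
I_SET = 0.020   # amps, current limit (20 mA)

# Load resistance at which the supply changes from CV to CC mode:
r_crossover = V_SET / I_SET
print(f"Crossover resistance = {r_crossover:.0f} ohms (1.5k)")

# An LED at roughly 3.2 V and 20 mA looks like a much lower resistance,
# so the supply ends up in constant current mode:
v_led = 3.2
r_led = v_led / I_SET
print(f"LED effective resistance = {r_led:.0f} ohms -> CC mode,"
      f" output sits at about {v_led} V and {I_SET*1000:.0f} mA")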
Title: Re: How do LED backlight testers know what voltage to provide?
Post by: Spork Schivago on April 09, 2019, 04:32:07 pm
If you set your PSU to 20mA, it will not be able to output 30V. It will output a maximum of 30V, or the voltage required to reach 20mA, whichever is lower.

A constant current source can be as simple as this: https://www.electronics-notes.com/images/transistor-active-current-source-01.svg (https://www.electronics-notes.com/images/transistor-active-current-source-01.svg)

So if I set it to 30V output, and set the current to 20mA, there's no risk of damaging the LEDs, assuming I have the polarity correct, and the LED will power up, but only have a Vf of whatever it requires?   So the company who makes the LED testers could just try to throw 300VDC at the strips with 20mA, for example, and then they only measure the Vf across the LED to display it to the user?
Title: Re: How do LED backlight testers know what voltage to provide?
Post by: Spork Schivago on April 09, 2019, 04:41:31 pm
I posted this on the Badcaps Forum a few years ago. I'll add that the 330VDC is produced by an isolated mains-powered SMPS.
Quote
I have traced some of the circuitry, it makes better sense now.
With the test leads open, the Source of Q1 rises to the point where the G-S voltage drops to the threshold voltage of about 2.5V. Q1's channel resistance will be very high.
When the leads are connected to a load, very little current flows through Q1 at first; the current increases as C8 charges. Q1 eventually ends up with about a 3V drop across it.
The current limiting is done by an unidentified SMPS IC that I have represented by Q2.

That's a schematic for one of those testers? That's great. I believe I'm starting to understand. The current flowing through an LED is an exponential function of the voltage across the LED. Thanks!!!!
Title: Re: How do LED backlight testers know what voltage to provide?
Post by: xavier60 on April 09, 2019, 04:47:49 pm
This looks interesting, http://lednique.com/current-voltage-relationships/iv-curves/ (http://lednique.com/current-voltage-relationships/iv-curves/)
Title: Re: How do LED backlight testers know what voltage to provide?
Post by: xavier60 on April 09, 2019, 04:58:51 pm
I partly traced the circuitry of a LED tester that looks just like the link in the 1st post. The SMPS areas are covered with a hard black epoxy.
I was mainly curious to see how the design got the current to ramp up gradually without causing current spikes when the probes are connected to the LEDs with the tester already powered.
Title: Re: How do LED backlight testers know what voltage to provide?
Post by: AG6QR on April 09, 2019, 07:26:54 pm
So what do they do?   Provide a constant source of 20mA, but how do they measure the needed voltage, especially without knowing what it needs?   I am still struggling a little with understanding this.

Others have already explained, but if it helps, the question is exactly analogous to asking how a voltage source knows how to measure and provide the correct current to an incandescent light bulb without knowing what it needs.

A voltage source provides a certain voltage, measuring that voltage and using a feedback loop to regulate it to required tolerance.  Current ends up being whatever it ends up being.  Swap the terms around and that same logic applies for current sources, with voltage left to vary according to the load.

In both cases, the uncontrolled parameter can only vary within limits, of course, before the supply falls out of regulation.  There's always a limit to the available power. 

A voltage source can maintain regulation while producing zero power into an open circuit, while a current source can maintain regulation while producing zero power into a dead short.
Title: Re: How do LED backlight testers know what voltage to provide?
Post by: soldar on April 09, 2019, 07:50:21 pm
Think of a constant current source as a positive displacement fluid pump. It will displace the same volume of fluid without regard to pressure. (Up to a point, obviously.)

I suggest you have a look at these pages

https://en.wikipedia.org/wiki/Current_source
https://en.wikipedia.org/wiki/Voltage_source
https://en.wikipedia.org/wiki/Ohm%27s_law
https://en.wikipedia.org/wiki/Kirchhoff%27s_circuit_laws
https://en.wikipedia.org/wiki/Millman%27s_theorem
https://en.wikipedia.org/wiki/Duality_(electrical_circuits)
https://en.wikipedia.org/wiki/Th%C3%A9venin%27s_theorem
https://en.wikipedia.org/wiki/Norton%27s_theorem
https://en.wikipedia.org/wiki/Source_transformation
https://en.wikipedia.org/wiki/Network_analysis_(electrical_circuits)
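Since a couple of those links cover source transformation, here's a minimal worked example of the Thevenin/Norton equivalence they describe; the component values are arbitrary.

Code: [Select]
# Thevenin <-> Norton source transformation, as covered by the links above.
# Arbitrary example values.

V_TH = 12.0    # volts, Thevenin (voltage-source) model
R_TH = 600.0   # ohms, series resistance

# Equivalent Norton (current-source) model: same resistor, now in parallel.
I_NO = V_TH / R_TH
R_NO = R_TH

# Both models deliver the same current into any load, e.g. a 400 ohm load:
R_LOAD = 400.0
i_thevenin = V_TH / (R_TH + R_LOAD)
i_norton   = I_NO * R_NO / (R_NO + R_LOAD)   # current divider

print(f"Norton source: {I_NO*1000:.0f} mA in parallel with {R_NO:.0f} ohms")
print(f"Load current, Thevenin model: {i_thevenin*1000:.1f} mA")
print(f"Load current, Norton model:   {i_norton*1000:.1f} mA")
# An ideal current source is the limit of a huge source resistance;
# an ideal voltage source is the limit of a tiny one.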
Title: Re: How do LED backlight testers know what voltage to provide?
Post by: Spork Schivago on April 10, 2019, 12:18:51 am
I'm not following. How would a constant current source work in this case? Let's say for 1 LED vs 20 LEDs, the first one draws 20mA for If and 1.2VDC for Vf, but the second set there, the 20 LEDs, draw 20mA for If but 3VDC for Vf. The 20 LEDs would be wired in series. So what do they do? Provide a constant source of 20mA, but how do they measure the needed voltage, especially without knowing what it needs? I am still struggling a little with understanding this.
The forward voltage drop of an LED is current dependent. The correct way to drive an LED is with a constant current source. You can use constant voltage, but it's tricky, since the resistance of an LED drops with both increasing temperature and increasing current. Suppose an LED has a nominal forward voltage of 3V at 10mA. If you set the power supply's voltage to 3V but don't limit the current, the LED might pass 10mA at first, but if it gets warm, its resistance will fall and it'll pass more than 10mA, causing it to heat up further and pass even more current. Using a constant current source, which passes the same current irrespective of the terminal voltage, solves this problem. If the same LED is connected to a 10mA constant current source and it warms up, its resistance will fall, but the power supply will limit the current to 10mA, so the voltage across it will fall, in accordance with Ohm's law, and it will heat up less, as P = V*I.

Quote
If I set my PSU to 20mA and the output voltage to 30VDC, I'd think I'd still somehow damage the LED, wouldn't I? I realize diodes are current driven, not voltage driven. Thanks!

Now you're referring to a constant current, constant voltage power supply. The PSU has two modes: constant voltage and constant current. It can't regulate both the current and the voltage at the same time, because the load fixes the relationship between them (Ohm's law).

When the power supply has a high resistance connected to it, it will be in constant voltage mode and when a low resistance is connected to it, it will be in constant current mode.

R = V/I = 30V/20mA = 30/0.02 = 1500 = 1.5k

So when the resistance connected to your power supply is above 1.5k, it will regulate the voltage, and when the resistance is lower, it will regulate the current.

If your LED has a forward voltage of 3V at 10mA, it will increase a little at 20mA, but not much, say 3.2V, so its effective resistance will be R = V/I = 3.2/0.02 = 160R.

Assuming the LED can handle 20mA and the power supply can switch between constant voltage and constant current modes instantaneously, switching it on and connecting an LED with a forward voltage of 3V to its output will cause the voltage to rapidly drop from 30V to about 3V and the current to increase from zero to 20mA, so the LED won't be damaged.

In reality, the power supply might take a few milliseconds to switch from constant voltage to constant current mode, briefly allowing a much higher current to flow through the LED, which could damage it. To avoid this, connect the LED to the output of the power supply before switching it on; then it should start in constant current mode, limiting the current through the LED to 20mA.

Thank you for that beautiful explanation! It answers my question perfectly! What is a safe current for unknown LEDs? 20mA seemed reasonable. I realize there are some LEDs that might require a lot of current, such as 1 amp; however, I think I'd (hopefully) be able to find a datasheet or recognize them as not-so-common LEDs. I wonder if I could set the voltage to 0, or unset it somehow, and just set the current, to make my Rigol DP832 a constant current source when testing LED strips.
Title: Re: How do LED backlight testers know what voltage to provide?
Post by: Spork Schivago on April 10, 2019, 12:25:55 am
I partly traced the circuitry of a LED tester that looks just like the link in the 1st post. The SMPS areas are covered with a hard black epoxy.
I was mainly curious to see how the design got the current to ramp up gradually without causing current spikes when the probes are connected to the LEDs with the tester already powered.
Why do you think they wanted the current to ramp up gradually?   Why not just try to filter out any surges?

Did you ever figure out the unknown component values? I'd love to try and build one of these testers. I need to fully understand how it works one of these days in the near future.
Title: Re: How do LED backlight testers know what voltage to provide?
Post by: Spork Schivago on April 10, 2019, 12:29:32 am
So what do they do?   Provide a constant source of 20mA, but how do they measure the needed voltage, especially without knowing what it needs?   I am still struggling a little with understanding this.

Others have already explained, but if it helps, the question is exactly analogous to asking how a voltage source knows how to measure and provide the correct current to an incandescent light bulb without knowing what it needs.

A voltage source provides a certain voltage, measuring that voltage and using a feedback loop to regulate it to required tolerance.  Current ends up being whatever it ends up being.  Swap the terms around and that same logic applies for current sources, with voltage left to vary according to the load.

In both cases, the uncontrolled parameter can only vary within limits, of course, before the supply falls out of regulation.  There's always a limit to the available power. 

A voltage source can maintain regulation while producing zero power into an open circuit, while a current source can maintain regulation while producing zero power into a dead short.
I don't know why, but I never looked at it that way until today, reading the previous posts.   I knew about a bulb and the current it'd draw, regardless, but I was looking at voltage differently.   I think I have a little more trouble with the voltage because I can power devices with too much voltage and fry them.   I think that stopped me from thinking about switching the terms around like that.   I'm learning a lot and I appreciate everyone's help.   I think I got this now.   Thanks!
Title: Re: How do LED backlight testers know what voltage to provide?
Post by: Spork Schivago on April 10, 2019, 12:34:24 am
Think of a constant current source as a positive displacement fluid pump. It will displace the same volume of fluid without regard to pressure. (Up to a point, obviously.)

I suggest you have a look at these pages

https://en.wikipedia.org/wiki/Current_source
https://en.wikipedia.org/wiki/Voltage_source
https://en.wikipedia.org/wiki/Ohm%27s_law
https://en.wikipedia.org/wiki/Kirchhoff%27s_circuit_laws
https://en.wikipedia.org/wiki/Millman%27s_theorem
https://en.wikipedia.org/wiki/Duality_(electrical_circuits)
https://en.wikipedia.org/wiki/Th%C3%A9venin%27s_theorem
https://en.wikipedia.org/wiki/Norton%27s_theorem
https://en.wikipedia.org/wiki/Source_transformation
https://en.wikipedia.org/wiki/Network_analysis_(electrical_circuits)

Yes, I think reading those will help, and probably bring a bunch more questions. I also have a book I've just started reading, called The Art of Electronics, by Paul Horowitz and Winfield Hill, Third Edition. Because of some of the damage I sustained in the Marines, I have trouble reading from physical books, but digitally, on a computer screen, I seem to retain the information much, much better. The authors were kind enough to provide me with a free digital copy because of this. I had purchased the hard copy, then asked about a digital one, and they sent it right away.

I got stuck in the book when I got to some calculus early on, so I went and bought a book that teaches calculus. I think I'm ready now to continue on with The Art of Electronics.
Title: Re: How do LED backlight testers know what voltage to provide?
Post by: amyk on April 10, 2019, 01:26:22 am
There's one thought which someone has tried to ask but hasn't been answered yet, so I'll give it a try --- if you have parallel LEDs, or LEDs with a higher rated current, then a constant current source will (ideally, not quite so in practice) just divide the current among them, and even (or perhaps especially?) high-current LEDs will visibly light at much lower currents than their maximum. That and the fact that large parallel circuits of LEDs are very rarely encountered means the tester still works; <1mA is enough to cause a visible glow for just about all LEDs I've seen.
Title: Re: How do LED backlight testers know what voltage to provide?
Post by: Spork Schivago on April 10, 2019, 05:31:57 pm
There's one thought which someone has tried to ask but hasn't been answered yet, so I'll give it a try --- if you have parallel LEDs, or LEDs with a higher rated current, then a constant current source will (ideally, not quite so in practice) just divide the current among them, and even (or perhaps especially?) high-current LEDs will visibly light at much lower currents than their maximum. That and the fact that large parallel circuits of LEDs are very rarely encountered means the tester still works; <1mA is enough to cause a visible glow for just about all LEDs I've seen.
I need to design and build some UV LED strips.   I look at the TVs and pay attention to how they do it, to try and improve my eventual design.   I want a couple versions of this device I'm building.   They would only differ in size and the number of LEDs.    I wanted one firmware for all three, and I wanted to be able to just add more LEDs without having to modify anything.   So we can pretend the three devices would be like this:   

Device A:  9 LEDs
Device B: 18 LEDs
Device C: 27 LEDs

The PSU for the LEDs would be a constant current source which is able to provide enough voltage for 27 LEDs.   With strips installed that provide a total of 9 LEDs, that PSU should work just fine, being a constant current source.   I was going to have strips, and the strips you could daisy chain together.   You'd just plug another strip in.   The LEDs in the strip would be wired in series.   The strips themselves would be wired in parallel.   I think that should work fine.   When the time comes, I will upload schematics and / or possible board layouts here, so people can provide suggestions on improvements, I think.
Title: Re: How do LED backlight testers know what voltage to provide?
Post by: ebastler on April 10, 2019, 06:29:47 pm
The PSU for the LEDs would be a constant current source which is able to provide enough voltage for 27 LEDs.   
...
The LEDs in the strip would be wired in series.   The strips themselves would be wired in parallel.   

But then you will have to provide separate current sources, one for each of the strips. Each would be set to the same target current, but e.g. the current source powering the 27-LED strip will put out a different voltage than the current source which powers the 9-LED strip.

If that's what you intend to build, it should work. But a single current source will not work.

I would probably keep it simple and build a single constant voltage source. Wire all the 9-LED chains in parallel (whether they are arranged away from each other or as one longer chain). Within each chain, wire the 9 LEDs in series and provide a series resistor which defines the current.
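As a rough illustration of that constant-voltage-plus-resistor approach, here's a minimal sketch; the 36V supply and 3.2V per-LED forward voltage are assumed example values, not recommended parts.

Code: [Select]
# Sizing the series resistor for one 9-LED chain driven from a fixed voltage.
# All numbers below are assumed example values.

V_SUPPLY   = 36.0   # volts, assumed constant-voltage supply
N_LEDS     = 9
VF_PER_LED = 3.2    # volts, assumed forward voltage per LED at the target current
I_TARGET   = 0.020  # amps (20 mA)

v_chain = N_LEDS * VF_PER_LED            # voltage taken by the LEDs
v_resistor = V_SUPPLY - v_chain          # what the resistor must drop
r_series = v_resistor / I_TARGET
p_resistor = v_resistor * I_TARGET

print(f"LED chain drops {v_chain:.1f} V, resistor drops {v_resistor:.1f} V")
print(f"Series resistor = {r_series:.0f} ohms, dissipating {p_resistor:.2f} W")
# Each 9-LED chain gets its own resistor; the chains then go in parallel
# across the same supply, so adding or removing chains doesn't upset the rest.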
Title: Re: How do LED backlight testers know what voltage to provide?
Post by: Spork Schivago on April 10, 2019, 06:45:16 pm
The PSU for the LEDs would be a constant current source which is able to provide enough voltage for 27 LEDs.   
...
The LEDs in the strip would be wired in series.   The strips themselves would be wired in parallel.   

But then you will have to provide separate current sources, one for each of the strips. Each would be set to the same target current, but e.g. the current source powering the 27-LED strip will put out a different voltage than the current source which powers the 9-LED strip.

If that's what you intend to build, it should work. But a single current source will not work.

I would probably keep it simple and build a single constant voltage source. Wire all the 9-LED chains in parallel (whether they are arranged away from each other or as one longer chain). Within each chain, wire the 9 LEDs in series and provide a series resistor which defines the current.

That's what I was planning. Either I'm confused, or I was being confusing. Let me try it a little differently:
Device A:  3 rows of strips, each strip has a total of 3 LEDs (just a simplified explanation), 3 strips total
Device B: 3 rows of strips, each strip has a total of 3 LEDs, 6 strips total.
Device C: 3 rows of strips, each strip has a total of 3 LEDs, 9 strips total.
Code: [Select]
DEVICE A:
┌┐  ┌┐  ┌┐
││  ││  ││
└┘  └┘  └┘

DEVICE B:
┌┐  ┌┐  ┌┐
││  ││  ││
├┤  ├┤  ├┤
││  ││  ││
└┘  └┘  └┘

DEVICE C:
┌┐  ┌┐  ┌┐
││  ││  ││
├┤  ├┤  ├┤
││  ││  ││
├┤  ├┤  ├┤
││  ││  ││
└┘  └┘  └┘
1   2   3
Why couldn't I use the same constant current source for each? I wasn't thinking the width would change, but maybe I should. With a constant voltage source, I would need to have a different source for each device, wouldn't I? Or use a different resistor for each device, which I didn't want to do, but could do if my preferred approach over-complicated the design.

So yeah, same power supply for all three devices, but it would put out a different voltage for the three different devices. It'd be capable of providing enough voltage for Device C, but should work equally well for Devices A and B. The LEDs will determine how much voltage they need based on the current I provide.
Title: Re: How do LED backlight testers know what voltage to provide?
Post by: ebastler on April 10, 2019, 07:04:01 pm
So yeah, same power supply for all three devices, but it would put out a different voltage for the three different devices. It'd be capable of providing enough voltage for Device C, but should work equally well for Devices A and B. The LEDs will determine how much voltage they need based on the current I provide.

Just to clarify: By "same power supply for all three devices", you mean that you have three power supplies, which are identical copies of the same design, but each device has its own copy, right?

OK, that's somewhat better. From your earlier post, I had understood that you wanted to combine LED chains of different lengths in one single device, and power them from one single current source. In the arrangements you have drawn at least the chains in each device require the same current.

But it still is not a clean design. The constant current your source provides will split itself over the three LED chains in the device -- but you cannot control the split. If one LED chain has slightly lower forward voltage than the other chains (due to parts tolerances), it will get all the current!

You can either connect all the LEDs in series and provide a constant current (at an inconveniently high voltage, if the total number of LEDs is high), or, as I said in my prior posts, provide a resistor for each group of LEDs to define the current, and apply a constant voltage to everything.
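To put a rough number on how badly the split can go, here's a minimal sketch using a simple exponential diode model; the 52mV slope factor and the 30mV per-LED mismatch are assumed illustrative values.

Code: [Select]
# Rough illustration of current hogging between two parallel LED chains fed
# from one constant current source.  Simple shifted-exponential diode model,
# ignoring any ballast resistance; slope factor and mismatch are assumptions.

from math import exp

I_TOTAL  = 0.060   # amps, total current from the source (ideally 30 mA per chain)
N_VT     = 0.052   # volts, assumed exponential slope factor per LED (n * Vt)
MISMATCH = 0.030   # volts, assumed extra forward voltage per LED in chain B

# Both chains see the same node voltage, so their currents differ by a fixed
# ratio set by the per-LED forward-voltage mismatch:
ratio = exp(-MISMATCH / N_VT)      # I_B / I_A
i_a = I_TOTAL / (1 + ratio)
i_b = I_TOTAL - i_a

print(f"Chain A (lower Vf):  {i_a*1000:.1f} mA")   # about 38 mA
print(f"Chain B (higher Vf): {i_b*1000:.1f} mA")   # about 22 mA
# A 30 mV per-LED spread already pushes one chain roughly 28% above its share.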
Title: Re: How do LED backlight testers know what voltage to provide?
Post by: Spork Schivago on April 10, 2019, 07:15:22 pm
So yeah, same power supply for all three devices, but it would put out a different voltage for the three different devices. It'd be capable of providing enough voltage for Device C, but should work equally well for Devices A and B. The LEDs will determine how much voltage they need based on the current I provide.

Just to clarify: By "same power supply for all three devices", you mean that you have three power supplies, which are identical copies of the same design, but each device has its own copy, right?
Yes sir.

OK, that's somewhat better. From your earlier post, I had understood that you wanted to combine LED chains of different lengths in one single device, and power them from one single current source. In the arrangements you have drawn at least the chains in each device require the same current.
Yes, and at the bottom, where 1 2 3 is labeled, there will be one power strip that all three connect to.   So the strips themselves will be wired in parallel, but the LEDs on the strips will be wired in series.

But it still is not a clean design. The constant current your source provides will split itself over the three LED chains in the device -- but you cannot control the split. If one LED chain has slightly lower forward voltage than the other chains (due to parts tolerances), it will get all the current!
I am glad we are discussing this.   This is the information I seek.   I want to design it proper like and in the best way possible.   Would providing that constant voltage source instead of a constant current source fix this?   I worry with a constant voltage source that if one strip goes out, the others will end up getting fried...that was why I asked the original question, trying to understand how someone can power multiple LEDs without knowing how many LEDs will be present.   I should have probably worded my question differently, but I felt it was easiest to show a device that does what I want to do and ask how it worked, so I didn't confuse a lot of people.

You can either connect all the LEDs in series and provide a constant current (at an inconveniently high voltage, if the total number of LEDs is high), or, as I said in my prior posts, provide a resistor for each group of LEDs to define the current, and apply a constant voltage to everything.
Yeah, but with that constant voltage, if one of the LEDs fails, then the other LEDs will absorb that voltage, the current will rise exponentially, and it will damage the LEDs, right? That's where I have been struggling, trying to find a way around this, until I learned about the constant current source. It seemed the constant current source was my answer. How do the TV manufacturers do it? That's what I need in the end, just like an LED-backlit TV. What type of high voltage are we talking about? With the TVs, it's what I would consider fairly high, sometimes at around 280VDC, maybe even higher. I wouldn't say over 300VDC though. Is that an inconveniently high voltage? I will go count the UV LEDs for Device C and see how many there are total. That might help a little.
Title: Re: How do LED backlight testers know what voltage to provide?
Post by: Spork Schivago on April 10, 2019, 07:32:09 pm
Currently, 12 strips of UV LEDs, 16 UV LEDs per strip. However, I think this number of LEDs can be greatly reduced by using better UV LEDs (perhaps from a source such as DigiKey or Mouser, instead of Alibaba), preferably SMD UV LEDs, and by using a diffuser. The first prototype has little boards, with 8 columns of LEDs, each column having 4 LEDs, for a total of 32 LEDs per PCB and an overall total of 384 UV LEDs. I daisy chain the PCBs together to get one large PCB.

12 PCBs total, configured like this:
Code: [Select]
X X X
X X X
X X X
X X X

or like this:
X X X X
X X X X
X X X X

So 12 PCBs, 8 columns of strips each, 32 UV LEDs per PCB in total. Every other column is offset by one LED position: the first column's first LED starts right at the edge of the board, the second column's first UV LED starts where the second LED of the first column would be, and the third column starts flush with the board edge like the first column. I hope that makes sense.
Title: Re: How do LED backlight testers know what voltage to provide?
Post by: ebastler on April 10, 2019, 07:53:38 pm
Yeah, but with that constant voltage, if one of the LEDs fails, then the other LEDs will absorb that voltage, the current will rise exponentially, and it will damage the LEDs, right?

If a single LED fails with a short circuit, the voltage source would indeed send a somewhat higher current through the LED chain. How much higher will depend on your choice of series resistor and drive voltage. If you let the resistor drop the majority of the voltage, the loss of one LED will have very little effect. (But the resistor will get warm...)

In contrast, if one of the LEDs fails with an open circuit in your constant current design, a whole chain of LEDs will no longer conduct current. Each of the remaining two chains will see 50% more current than before. That's not healthy either...

Quote
What type of high voltage are we talking about? With the TVs, it's what I would consider fairly high, sometimes at around 280VDC, maybe even higher. I wouldn't say over 300VDC though. Is that an inconveniently high voltage?

Up to 60V DC is considered inherently safe (for users), so there's no need to take failsafe or redundant precautions against the user touching the conducting parts. I would aim to stay below that.
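To put numbers on how much the resistor's share of the voltage helps, here's a minimal sketch; the 36V and 48V supplies and the 3.2V forward voltage are assumed example values.

Code: [Select]
# How much the chain current rises if one LED fails short, for two different
# amounts of headroom across the series resistor.  All values are assumed
# examples (9 LEDs at 3.2 V each, 20 mA target).

N_LEDS, VF, I_TARGET = 9, 3.2, 0.020

def current_after_short(v_supply):
    """Chain current after one LED shorts, with R sized for I_TARGET."""
    v_resistor = v_supply - N_LEDS * VF           # headroom at nominal
    r = v_resistor / I_TARGET                     # resistor sized for 20 mA
    i_fault = (v_supply - (N_LEDS - 1) * VF) / r  # one LED shorted
    return v_resistor, i_fault

for v_supply in (36.0, 48.0):
    v_r, i_fault = current_after_short(v_supply)
    rise = (i_fault / I_TARGET - 1) * 100
    print(f"{v_supply:.0f} V supply: resistor drops {v_r:.1f} V, "
          f"fault current {i_fault*1000:.1f} mA (+{rise:.0f}%), "
          f"resistor dissipates {v_r * I_TARGET:.2f} W normally")
# More voltage across the resistor means a smaller relative jump when an LED
# shorts, at the cost of more power wasted in the resistor.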
Title: Re: How do LED backlight testers know what voltage to provide?
Post by: Spork Schivago on April 10, 2019, 09:25:39 pm
Yeah, but with that constant voltage, if one of the LEDs fails, then the other LEDs will absorb that voltage, the current will rise exponentially, and it will damage the LEDs, right?

If a single LED fails with a short circuit, the voltage source would indeed send a somewhat higher current through the LED chain. How much higher will depend on your choice of series resistor and drive voltage. If you let the resistor drop the majority of the voltage, the loss of one LED will have very little effect. (But the resistor will get warm...)

In contrast, if one of the LEDs fails with an open circuit in your constant current design, a whole chain of LEDs will no longer conduct current. Each of the remaining two chains will see 50% more current than before. That's not healthy either...
...

Why would an LED failing open circuit cause the other LEDs to draw more current in a constant current design? I thought because the current was constant, it would not change. If I am feeding the main power rail that the individual strips connect to with, let's say, 20mA of current, regardless of how many strips there are, wouldn't it always stay at 20mA? I thought this was the entire purpose of a constant current source. I must have missed something somewhere.

Finding quality UV LEDs has been a struggle. I think 350nm to 450nm is the wavelength range I need, albeit it's been quite a while since I worked on the project and did the research back then. It's for a UV exposure chamber that I'm building, to transfer PCB images from transparencies to the boards.
Title: Re: How do LED backlight testers know what voltage to provide?
Post by: ebastler on April 11, 2019, 04:36:58 am
Why would an LED failing open circuit cause the other LEDs to draw more current in a constant current design?   

I was referring to your arrangement with three chains of LEDs, where the chains are connected in parallel and driven by a single current source. The current source will be designed to provide 3x the current which a single LED chain needs. If one LED fails open circuit, its whole chain can no longer conduct any current. So the 3x current will now be distributed over only 2 chains.

(But, as stated earlier, that parallel connection is not a good idea anyway, even if no LED fails. The distribution of current over the three chains is not well-defined and will depend on parts tolerances of the LEDs.)
Title: Re: How do LED backlight testers know what voltage to provide?
Post by: Spork Schivago on April 12, 2019, 05:28:05 am
So what would be the best design? Would it be best to not wire the three strings in parallel, and just keep them all wired in series, so no matter which way one fails, the rest survive? That's the only safe way I could come up with for now. But I do not yet know all the various electronic components available.
Title: Re: How do LED backlight testers know what voltage to provide?
Post by: Zero999 on April 13, 2019, 06:28:54 pm
LEDs get wired in parallel all the time. As long as they're the same type it doesn't matter. Yes, one LED could fail open circuit, causing the others to receive a higher current and fail, but by the time that happens, the lamp will be reaching the end of its life and be due for replacement.
Title: Re: How do LED backlight testers know what voltage to provide?
Post by: Spork Schivago on April 13, 2019, 06:32:42 pm
What's the best way to safely test LED strips? I set my Rigol DP832 to 20mA with OCP (Over Current Protection) at 30mA. I set the voltage to 30VDC. I turn the channel on before anything is hooked up, I see the voltage rising, and then it trips the over current protection. So I turn the channel off, hook the probes up, turn it back on, and the LEDs light. The current stays at 30mA and the voltage goes all the way up to 26VDC. Of the total of 13 LEDs on this strip, only 4 or 5 light up (because I fried it before, thinking they were 3VDC LEDs). How can I tell what voltage to use when I cannot find a datasheet or any info on the LEDs? I could remove one and use my DCA meter to measure it, which would give me the Vf, but that could be a lot of work if I had to do it with every TV I come across that shows symptoms of possibly bad LEDs.
Title: Re: How do LED backlight testers know what voltage to provide?
Post by: Zero999 on April 13, 2019, 06:48:02 pm
The second approach is the correct one: set the current to something safe for the LED strip, such as 20mA, with as high a voltage as possible, connect the LED strip and turn on the power. The power supply will limit the current, with the forward voltage of the LEDs determining the voltage.
Title: Re: How do LED backlight testers know what voltage to provide?
Post by: Spork Schivago on April 15, 2019, 04:47:12 pm
I want to make sure I understand why, if one LED fails open in my parallel circuit design, the other two strips will receive more current. It's already been said, but I want to make sure I understand. It's because in a parallel circuit, the total current is equal to the currents through each branch added together, right? So my constant current source would need to provide 60mA if I wanted each strip to have a total of 20mA. Three strips, each branch receiving 20mA: 20mA + 20mA + 20mA = 60mA. That's why if one fails open, that strip would no longer work, and the other two would receive more, right?
Title: Re: How do LED backlight testers know what voltage to provide?
Post by: macboy on April 15, 2019, 05:19:42 pm
If you set your PSU to 20mA, it will not be able to output 30V. It will output a maximum of 30V, or the voltage required to reach 20mA, whichever is lower.

A constant current source can be as simple as this: https://www.electronics-notes.com/images/transistor-active-current-source-01.svg (https://www.electronics-notes.com/images/transistor-active-current-source-01.svg)

So if I set it to 30V output, and set the current to 20mA, there's no risk of damaging the LEDs, assuming I have the polarity correct, and the LED will power up, but only have a Vf of whatever it requires?   So the company who makes the LED testers could just try to throw 300VDC at the strips with 20mA, for example, and then they only measure the Vf across the LED to display it to the user?

For an ideal current source, yes this would work.
A lab bench power supply is not an ideal current source! So don't do this.

An ideal current source has an "infinite" internal AC impedance at the terminals. This means there will be no change in output current for a change in voltage. An ideal voltage source has a zero AC impedance at the terminals, so that there will be no change in voltage for a change in current. In a normal lab power supply, the voltage regulator feedback loop, combined with some capacitance at the output terminals, provides the low AC impedance (in constant voltage mode). In constant current mode, the current regulation loop does provide a high impedance, attempting to keep current constant even as voltage changes, but the problem is that the output capacitor is still present across the output terminals. This is the big deadly sin which prevents true current-source operation for a typical lab power supply. Some special current source instruments have always been available for applications which can't tolerate that capacitance. One such is the HP 6186C. It has no output capacitor.

The output capacitor will (independent of the active regulation) provide a high current spike to a load that is connected across the output when any voltage is present. This can easily kill an LED or Zener that you are trying to test at a fixed current. Instead, you must first set the desired current (20 mA etc.), then set the voltage to zero(-ish), then connect the device under test, then ramp up the voltage. Above some voltage setting, the supply will switch to constant current mode, and then you can measure the output voltage to find the forward voltage of your LED or the breakdown voltage of your Zener (etc.).
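To get a feel for the size of that spike, here's a minimal sketch; the capacitor value and loop resistance are assumptions for a typical bench supply and test leads, not measured figures.

Code: [Select]
# Rough estimate of the current spike from a bench supply's output capacitor
# when an LED is connected with the output already sitting at the voltage
# setpoint.  Capacitor value and loop resistance are assumptions.

C_OUT  = 470e-6   # farads, assumed output capacitance of the supply
V_OUT  = 30.0     # volts, supply idling at its voltage setpoint (no load)
V_LED  = 3.2      # volts, LED forward voltage
R_LOOP = 0.5      # ohms, assumed lead + ESR + contact resistance

# Peak current the instant the LED touches the terminals, before the
# current-regulation loop can react:
i_peak = (V_OUT - V_LED) / R_LOOP
# Charge the capacitor has to dump to get down to the LED's forward voltage:
q_dump = C_OUT * (V_OUT - V_LED)

print(f"Peak current  = {i_peak:.0f} A")       # roughly 54 A
print(f"Charge dumped = {q_dump*1000:.1f} mC") # roughly 12.6 mC
# Compare that with a small LED's surge rating (often well under 1 A for a
# few microseconds) and it's clear why a pre-charged output kills LEDs.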
Title: Re: How do LED backlight testers know what voltage to provide?
Post by: Zero999 on April 15, 2019, 06:21:17 pm
If you set your PSU to 20mA, it will not be able to output 30V. It will output a maximum of 30V, or the voltage required to reach 20mA, whichever is lower.

A constant current source can be as simple as this: https://www.electronics-notes.com/images/transistor-active-current-source-01.svg (https://www.electronics-notes.com/images/transistor-active-current-source-01.svg)

So if I set it to 30V output, and set the current to 20mA, there's no risk of damaging the LEDs, assuming I have the polarity correct, and the LED will power up, but only have a Vf of whatever it requires?   So the company who makes the LED testers could just try to throw 300VDC at the strips with 20mA, for example, and then they only measure the Vf across the LED to display it to the user?

For an ideal current source, yes this would work.
A lab bench power supply is not an ideal current source! So don't do this.

An ideal current source has an "infinite" internal AC impedance at the terminals. This means there will be no change in output current for a change in voltage. An ideal voltage source has a zero AC impedance at the terminals, so that there will be no change in voltage for a change in current. In a normal lab power supply, the voltage regulator feedback loop, combined with some capacitance at the output terminals, provides the low AC impedance (in constant voltage mode). In constant current mode, the current regulation loop does provide a high impedance, attempting to keep current constant even as voltage changes, but the problem is that the output capacitor is still present across the output terminals. This is the big deadly sin which prevents true current-source operation for a typical lab power supply. Some special current source instruments have always been available for applications which can't tolerate that capacitance. One such is the HP 6186C. It has no output capacitor.

The output capacitor will (independent of the active regulation) provide a high current spike to a load that is connected across the output when any voltage is present. This can easily kill an LED or Zener that you are trying to test at a fixed current. Instead, you must first set the desired current (20 mA etc.), then set the voltage to zero(-ish), then connect the device under test, then ramp up the voltage. Above some voltage setting, the supply will switch to constant current mode, and then you can measure the output voltage to find the forward voltage of your LED or the breakdown voltage of your Zener (etc.).
Yes, the output capacitor messes up the constant current transient response. Slowly ramping up the voltage is one way to overcome it. Other options are to connect the LED to the output, with the power off, then turn the power on, or short circuit the power supply, connect the LEDs and remove the shorting link.

I'm thinking of designing a bench power supply with decent CC as well as CV regulation, perhaps with a switch to select between optimum current and voltage regulation.

I want to make sure I understand why, if one LED fails open in my parallel circuit design, the other two strips will receive more current. It's already been said, but I want to make sure I understand. It's because in a parallel circuit, the total current is equal to the currents through each branch added together, right? So my constant current source would need to provide 60mA if I wanted each strip to have a total of 20mA. Three strips, each branch receiving 20mA: 20mA + 20mA + 20mA = 60mA. That's why if one fails open, that strip would no longer work, and the other two would receive more, right?
Yes, that's right. Another thing to note is that if one LED in a string failed short circuit, that string would hog all of the current, causing the other strings to turn off or go very dim, before the whole string fails from overheating.
Title: Re: How do LED backlight testers know what voltage to provide?
Post by: Atanas79 on October 02, 2020, 12:08:28 am
Hello
I need your help.
In this very relevant topic you explained how the Chinese TV backlight tester works, and you pointed out part of a schematic for a smooth increase in current only when there is a load.
Would you send me the full schematic?
I'd like to add it to my homemade tester, which looks like this:

(https://imgur.com/a/zrxr7sM)