
Why does the "dim bulb test" limit current instead of adding load to the supply?


Beamin:
This always seemed backwards to me. I always thought that if you took a 100 watt load (a light bulb) and put it in series with, say, another 100 watt electronic device, you would pull 200 watts out of your power supply. But in reality the most it pulls is 100 watts, and when the load is added the total stays around 100 while the bulb dims? To figure out which device, bulb or load, gets what share of that hundred watts, would you have to measure the resistance of the bulb while it's hot, and of your electronic device, and then work it out with E = IR? Do bulbs behave as temperature-controlled variable resistors? Or is there something special about the bulb where its resistance changes in proportion to filament temperature? I always thought 100 watts + 100 watts = 200 watts of draw. With no real light bulbs to test with, I can't try this myself.


Also, instead of a 100 W light bulb, which is illegal to sell now, can I substitute a CFL with a really high CRI that says on the box it's as bright as a 100 watt bulb?  :-DD Will I get arrested if I plug in a 100 watt filament bulb, based on the signals it sends to the power company? :)

Rerouter:
A 100 W bulb draws more than 100 W when it is cold, as the copper's resistance is lower while it's cold.

The bulb is there because it limits the maximum current, and if you see the bulb go bright, you know right away something has gone wrong.
It also lets you test things with a decent startup current, since again the filament has a lower resistance while cold, versus a fixed load resistor, which would droop much more.

sokoloff:
A dim bulb is a series resistor that has low resistance at low filament temp (low current) and higher resistance as current increases (and filament heats up).

To figure the power, it’s a simple series resistor problem. With a short circuit in series with the bulb, the system draws 100 watts (just like when the "short circuit" is the mains wiring in a normal lamp installation). That’s the most power the circuit will draw.
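
To put rough numbers on it, here is a quick sketch (assuming 120 V mains; the ~144 ohm hot resistance is just V²/P for a 100 W bulb, and the roughly 10x lower cold resistance is a rule of thumb for tungsten, not a measured figure):

Code:
# Series "dim bulb" split between the bulb and the device under test (DUT).
V_MAINS = 120.0                    # assumed 120 V mains
R_BULB_HOT = V_MAINS**2 / 100.0    # ~144 ohm for a 100 W bulb at full brightness
R_BULB_COLD = R_BULB_HOT / 10      # tungsten has roughly 10x lower resistance cold

def series_split(r_bulb, r_dut, v=V_MAINS):
    """Current and per-element power for the bulb and DUT in series."""
    i = v / (r_bulb + r_dut)
    return i, i**2 * r_bulb, i**2 * r_dut

# Worst case: the DUT is a dead short, so all the voltage lands on the bulb.
i, p_bulb, p_dut = series_split(R_BULB_HOT, 0.0)
print(f"shorted DUT: {i:.2f} A, bulb {p_bulb:.0f} W, DUT {p_dut:.0f} W")

# Healthy DUT drawing modest current: the bulb stays cool and drops little voltage.
i, p_bulb, p_dut = series_split(R_BULB_COLD, 500.0)
print(f"healthy DUT: {i:.2f} A, bulb {p_bulb:.1f} W, DUT {p_dut:.1f} W")

In the shorted case the whole 100 W ends up in the bulb; in the healthy case the bulb only dissipates a watt or so and the DUT sees almost all of the mains voltage.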

Audioguru:

--- Quote from: Rerouter on August 18, 2018, 01:17:07 pm ---A 100 W bulb draws more than 100 W when it is cold, as the copper's resistance is lower while it's cold.
--- End quote ---
The filament is not copper, which would melt; it is tungsten. The resistance of tungsten is also lower when cold.

CatalinaWOW:
There are two different problems here.  A one hundred watt bulb consumes one hundred watts when line voltage is applied and it is fully heated.  If you put two of them in series, neither gets full line voltage; they each get half.  If they were simple resistors, cutting their voltage in half would cut their power by a factor of four.  Each bulb would consume 25 watts, and the total would be fifty watts.

The answer is complicated by the temperature-varying resistance of the bulbs, but if the bulbs are matched the result is the same.
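
As a quick check of that arithmetic, here is a sketch assuming 120 V mains and treating each hot bulb as a fixed ~144 ohm resistor (the simple-resistor case above):

Code:
# Two identical 100 W / 120 V bulbs in series, treated as fixed resistors.
V = 120.0
R = V**2 / 100.0          # ~144 ohm per bulb when fully hot
i = V / (2 * R)           # series current is half a single bulb's current
p_each = i**2 * R         # power in each bulb
print(f"{i:.3f} A, {p_each:.0f} W per bulb, {2 * p_each:.0f} W total")
# -> 0.417 A, 25 W per bulb, 50 W total: half the voltage, a quarter of the power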

Putting a bulb in series with a unit under test allows almost full current at startup.  But if a short or other high-current fault exists, the bulb heats up rapidly and limits the current, hopefully to a non-damaging level.  A low-wattage bulb limits the current to a lower level and might impede the operation of a properly functioning device, so the wattage of the bulb must be selected thoughtfully.
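
For picking the wattage, the fully lit bulb sets a rough ceiling on fault current of about (bulb watts)/(line volts). A small sketch, again assuming 120 V mains:

Code:
# Rough fault-current ceiling per bulb wattage, with the bulb fully lit
# across the whole supply (i.e. the DUT is effectively a dead short).
V = 120.0
for watts in (25, 40, 60, 100):
    r_hot = V**2 / watts                      # hot filament resistance
    print(f"{watts:3d} W bulb: roughly {V / r_hot:.2f} A into the fault")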
