charge a super capacitor
nickeevblog10:
Now, just thinking, this power adapter may be OK for charging the capacitor, and I could safely begin charging and test the output of the power adapter at 3 V under load on the capacitor.
How does this sound to you guys?
Thank you very much for your feedback.
StillTrying:
Nothing can be done safely with power supplies, batteries or charged caps until you learn how to measure current without shorting things. It's possible you've already damaged the DVM and/or the 3 V supply!
https://www.allaboutcircuits.com/textbook/experiments/chpt-2/ammeter-usage
Psi:
--- Quote from: nickeevblog10 on May 23, 2019, 01:31:50 am ---If I check a laptop power supply for voltage, I turn the power adapter on and put the probes on the two terminals of the plug that goes into the laptop socket, and I usually get a seemingly correct result of around 19 V from a 19 V adapter. Is this shorting also?
--- End quote ---
No, that is not shorting. You can do whatever you like in volts mode.
The danger is measuring Amps.
You must connect the meter correctly when measuring amps or you might blow something up.
Those 1.2 A or 1.9 A readings you are getting are dangerous. You are shorting out the power supply.
mariush:
To measure voltage, it's enough to put each probe on the positive and negative terminals.
To measure the current going into a device, the multimeter must be placed in-circuit: the current must flow through the multimeter and then into the device.
So, for example, one probe goes to the positive of your power supply and the other probe goes to your device; that second probe effectively becomes the positive of your power supply as far as the device is concerned.
Depending on what range the multimeter is set to, there will be some voltage drop inside the multimeter, because the actual measurement is done across a resistor, and you have the basic formula (Ohm's law): voltage = current x resistance.
So, for example, on the 10 A range your multimeter may use a 0.1 ohm resistor, but on the 1 A range the resistor may be 1 ohm.
With this example, if you set the multimeter to the 1 A range and it has a 1 ohm resistor inside, then when measuring a 5 V power supply, at 100 mA the device will see 5 V - 0.1 A x 1 ohm = 4.9 V, and at 500 mA it will see 5 V - 0.5 A x 1 ohm = 4.5 V.
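To put those numbers in one place, here's a small Python sketch of that burden-voltage arithmetic; the 1 ohm shunt and 5 V supply are just the example values above, not any particular meter's specs.
--- Code: ---
# Minimal sketch of the burden-voltage arithmetic above.
# The 1 ohm shunt and 5 V supply are the example values, not real meter specs.

def voltage_at_device(v_supply, current_a, shunt_ohms):
    """Voltage left for the device after the meter's shunt drops I x R (Ohm's law)."""
    return v_supply - current_a * shunt_ohms

# 1 A range with an assumed 1 ohm shunt, 5 V supply:
for current in (0.1, 0.5):
    v = voltage_at_device(5.0, current, 1.0)
    print(f"{current * 1000:.0f} mA -> {v:.1f} V at the device")
# prints: 100 mA -> 4.9 V at the device
#         500 mA -> 4.5 V at the device
--- End code ---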
Power supplies can provide a limited amount of power, where power means the product of voltage and current.
In your particular case, your power supply may only guarantee 3 V at 500 mA, which means it can give you 3 V x 0.5 A = 1.5 watts of power. There may be some "reserve", some tolerance, so in reality it may give 550-600 mA.
If a device needs more, the power supply may let more current flow but at a lower voltage. For example, you may measure 2 A of current, but the voltage may only be 0.75 V, because the power supply is still only capable of 1.5 watts of power (0.75 V x 2 A = 1.5 watts).
A fancier power supply may turn itself off if a device takes too much current or if the voltage goes below some threshold, to protect devices connected to it.
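Here's a rough Python sketch of that power-budget idea; treating the overloaded supply as simply holding its ~1.5 W limit is a simplification for illustration, since a real adapter may sag unevenly or shut down.
--- Code: ---
# Rough sketch of the power budget: P = V x I, capped at about 1.5 W (3 V x 0.5 A).
# Treating the overloaded supply as "constant power" is a simplification for illustration.

P_MAX_W = 3.0 * 0.5   # 1.5 W, the guaranteed rating used in the example above

def sustainable_voltage(current_a, p_max_w=P_MAX_W, v_nominal=3.0):
    """Voltage the supply can hold at a given current draw, never above its nominal 3 V."""
    return min(v_nominal, p_max_w / current_a)

print(sustainable_voltage(0.5))   # 3.0  -> within the 500 mA rating
print(sustainable_voltage(2.0))   # 0.75 -> the 2 A example: 0.75 V x 2 A = 1.5 W
--- End code ---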
So when you placed the multimeter directly across the positive and negative wires of your power supply, you more or less created a short circuit between the wires: in reality, the very small resistor the multimeter uses to measure current, plus the resistance of the probes, were the only load connected.
So the power supply pushed as much current as it could through that "load" between the positive and negative wires, which was probably the 1.92 A you measured... but you don't know what voltage that was at; it could have been 1 V.
You also don't know whether the power supply can actually sustain that "effort" for long periods of time; components inside could be damaged or fail after a while.
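Purely for illustration, here is roughly what that accidental "ammeter across the supply" might have looked like in numbers; the shunt and lead resistances below are assumptions, since the actual meter and its real values aren't known.
--- Code: ---
# Hedged sketch of the accidental short through the meter.
# R_SHUNT_OHMS and R_LEADS_OHMS are assumed values for illustration only;
# the real meter's shunt and lead resistance are not known from this thread.

MEASURED_CURRENT_A = 1.92   # the reading reported earlier
R_SHUNT_OHMS = 0.1          # assumed 10 A-range shunt
R_LEADS_OHMS = 0.1          # assumed combined probe-lead resistance

# With the meter as the only "load", the supply output collapses to roughly
# the drop across shunt + leads (Ohm's law again):
v_across_meter = MEASURED_CURRENT_A * (R_SHUNT_OHMS + R_LEADS_OHMS)
print(f"~{v_across_meter:.2f} V across the meter, far below the labelled 3 V")
--- End code ---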
Another thing you have to understand: a lot of power supplies regulate their voltage properly only when something is connected to them and that something consumes at least a tiny bit of power.
Some power supplies will output a higher voltage with nothing connected to them, but as soon as something is connected, the voltage gets closer to the actual specified voltage.
For example, cheap wall wart adapters that use old-style transformers may be rated to output 9 V, but they'll output 10-11 V with nothing connected to them. If you connect something as simple as an LED (with its current-limiting resistor) that consumes 10-15 mA, you may notice that the voltage gets closer to 9 V.
Other power supplies use cheap techniques to regulate the output voltage, which can cause the power supply to output a higher voltage at very low currents. For example, you may see a phone charger output 5.5 V with nothing connected to it, but as soon as something consumes power, the voltage goes down closer to 5 V.
There are various levels of quality when it comes to power supplies.
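To put rough numbers on that LED-plus-current-limiting-resistor minimum-load example, here's a small Python sketch; the 2 V LED forward voltage and the 680 ohm resistor are assumptions chosen to land in the 10-15 mA range mentioned above.
--- Code: ---
# Sketch of the minimum-load example: an LED plus current-limiting resistor on a 9-11 V wall wart.
# The 2 V LED forward voltage and the 680 ohm resistor are assumptions for illustration.

def led_current_ma(v_supply, r_ohms, v_led=2.0):
    """Approximate LED current in mA, ignoring the small change of Vf with current."""
    return (v_supply - v_led) * 1000.0 / r_ohms

R_OHMS = 680.0
for v in (9.0, 10.0, 11.0):
    print(f"{v:.0f} V adapter -> about {led_current_ma(v, R_OHMS):.1f} mA through the LED")
# roughly 10-13 mA across the range, enough load to pull the output closer to 9 V
--- End code ---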
nickeevblog10:
To test the power adapter at 3 V, I bought a 3 V LED, and I understand I need to break the circuit between the power adapter and the load (the LED) to measure the amps at 3 V. Do I also need to use a transistor in the circuit?
EDITED: Looking back at the first reply to my question, I see the meter must have a resistor in it to get any ampere reading, so I'm presuming I need not add a second resistor to the test circuit. How am I doing here?
Thanks for taking the time.
Here is the result of the power adapter test for amps using a 3 V LED; see the attachment.
I am concluding the power adapter is OK, although the current is very low for charging the super capacitor.
But how can I test the power adapter to see if it will put out 500 mA?