Beginner questions about determining DC power supply requirements
Electro Fan:
Some beginner questions about determining DC power supply requirements.

1.   If a device doesn't clearly specify whether the DC input power connector is center pin positive or center pin negative, how would you determine the polarity?  Is it necessary to open the device and trace the circuit, or is there a way to probe/test the center pin and/or outer barrel to determine the polarity?  (Remember, in this situation no power is applied to the device through the connector because we don't yet know the connector's polarity with any confidence – so we are trying to figure out the center pin polarity with the device in an off state.)

2.   (Separate question from above; in this case we know the center pin polarity.)  If a device doesn't specify its current requirement (no label or documentation) but does specify voltage (along with the correct center pin polarity) on a chassis label – and the voltage is specified as a range (such as 13V to 24V) – how would you go about a) determining the current requirement?  And b) once you know the current requirement, either because you figured it out through testing or because you found the documentation, how would you go about deciding what voltage to apply within the range?  For example, let's say through testing or documentation we learn that the device draws a maximum of 3 Amps.  How would you select the voltage within the 13V – 24V range?  Would you go toward the low end (13V), the high end (24V), or somewhere in the middle?  What would drive your decision on the particular voltage within the specified range?

Thanks
ArthurDent:
On #2, if you know it takes 13-24 VDC and you know the max current and the polarity, I'd say it depends. Using a bench supply with V and A meters, I would apply an increasing voltage from 13 to 24 VDC and see if the current remains fairly constant. If it does, that probably means the device has a linear regulator inside, so any voltage over the minimum required will be dissipated as heat. In this case I'd go with 13-15 VDC, depending on what you have available for a supply in that range; this will let the device run cooler and waste less energy as heat.

If, as you increase the voltage from 13 to 24 VDC, you see the current drop to almost half its value, then the device probably has a switching DC-DC converter inside and you could use any voltage in that range, but I'd favor something close to 24 VDC unless you want to power it in your car. I have some telco GPS units that will operate from 18-72 VDC, and the current draw varies greatly depending on the supply voltage I use.
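
To make the two cases concrete, here's a rough Python sketch with made-up numbers (a 13 W load and 90% converter efficiency, not figures from any real device) showing what the bench current readings might look like as you sweep the supply:

--- Code: ---
# Illustration only: made-up numbers, not measurements from a real device.
# Shows why input current stays flat for a linear-regulated load but falls
# as the input voltage rises for a switching (DC-DC) load.

def linear_input_current(v_in, i_load=1.0):
    """A linear regulator passes the load current straight through,
    so the input current is roughly constant regardless of v_in."""
    return i_load

def switcher_input_current(v_in, p_load=13.0, efficiency=0.9):
    """A switching converter draws roughly constant power,
    so the input current drops as v_in rises (I = P / V)."""
    return p_load / (efficiency * v_in)

for v_in in (13, 18, 24):
    i_lin = linear_input_current(v_in)
    heat = (v_in - 13) * i_lin  # excess voltage burned off as heat in a linear reg
    i_sw = switcher_input_current(v_in)
    print(f"{v_in:2d} V in: linear ~{i_lin:.2f} A ({heat:4.1f} W extra heat), "
          f"switcher ~{i_sw:.2f} A")
--- End code ---

With these hypothetical numbers, the switcher draws roughly 0.6 A at 24 V versus about 1.1 A at 13 V (the near-halving described above), while the linear case keeps drawing the same current and dissipates the extra ~11 W as heat.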

On #1, I'd check inside to see where the power goes to a big capacitor and see if I could figure it out from that. Sometimes the input goes through a diode for reverse-polarity protection; you can either spot that visually, or try to figure it out with a DMM on the input pins without taking the device apart.
rstofer:
Plug polarity:  I would use a DMM.  With a positive voltage in the display (as opposed to a negative value) and given that the red probe is positive, it's pretty easy to figure out.

I would choose a supply voltage that is convenient.  If the application is automotive, I would be on the low voltage end of things.

If I didn't know the required current, maybe I could find a wattage and work backwards.  Either that or I would just use a car battery, knowing that it can provide up to 1000A for a brief period.  I would measure the current with my DMM.  I would probably put a 5A or 10A fuse in series.  The meter probably has a 10A fuse and I wouldn't really want to blow it.
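
(On the wattage approach, as a made-up example: a hypothetical label of 36 W at 12 V works out to 36 W / 12 V = 3 A.)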

If you do use a car battery, make sure you get the polarity right or you'll see just what 1000A of 12V can do!

An alarm battery or motorcycle battery would probably be safer.

All above assuming you don't have a bench PS capable of delivering, say, 3A at 30V.

Somehow, you should be able to intuit whether the thing is going to take milliamps or megaamps.

Why not tell us what kind of device you're working on?

Electro Fan:

--- Quote from: rstofer on May 31, 2019, 07:47:45 pm ---Plug polarity:  I would use a DMM. [...]

--- End quote ---

Hi rstofer,

Thanks.  On the polarity, I don't see how you can get a DMM to tell you the polarity if there is no power present; and to apply power without damaging the device, you would have to know the polarity first.  See the catch-22 here?  In this "use case" there are no red or black wires visible or any markings on the chassis cover - just an opening for a power plug.  By convention it's probably center pin positive, but it could be center pin negative.  It's easy to test under power, but if you power it up with the wrong polarity you could damage the device.  I get that inside there might be some markings or color-coded wiring.  So maybe the answer is that there is no good way to know without removing the covers.

For the second set of questions, we are assuming we don't know the current requirement or the wattage specs; we just know a range of acceptable voltages.  (For whatever reason the device has relatively poor markings and no documentation.)

As for the device I have in mind, we can be pretty sure it will require a few Amps (maybe anywhere from 1-5 and probably 2-4), so I don't think we need or want a car battery.  (Yes, a bench linear supply is available.)  Here, the question is how best to choose the specific voltage to apply, given the range of voltages that has been specified.

In summary, the questions are really aimed at learning a) how to determine the polarity before powering up a device (so as not to damage it by guessing wrong), and b) how to determine which voltage within a specified range of voltages would be best.



Electro Fan:

--- Quote from: ArthurDent on May 31, 2019, 07:37:47 pm ---On #2, if you know it takes 13-24 VDC and you know the max current and the polarity, I'd say it depends. [...]

--- End quote ---

Hi ArthurDent, Thanks!  Sounds like #1 requires removing the covers, or at least being able to peek inside to examine the circuit.  But if I'm missing something and there is a handy dandy way to determine polarity by just probing the plug and measuring something at the unpowered receptacle on the chassis, that would be good to know.  (I think the answer is that end users shouldn't have to measure and test everything; that's why there are standards, and in lieu of standards there should be markings.  In this case there just didn't happen to be sufficient markings, labeling, documentation, etc.)

On #2, your answer helped me learn that I should use a bench supply and start on the low end while observing the current draw.  It also helped me learn that if the current stays reasonably constant the device probably has a linear regulator, and that if the current decreases (nearly in half in this case) the device probably has a switching DC-DC converter.  I get why with the linear regulator it would make sense to operate near the low end of the voltage range, but please say more about why with the DC-DC converter the high end of the voltage range makes more sense.  Is it because we have to assume the designer specified the range (while using a DC-DC converter) so that peak current requirements can be met?  And if so, why didn't the designer just specify the higher end of the voltage range?  Also, why is telco equipment spec'd with voltage ranges?  Just to make it easy to find a power supply to power the equipment?  Or is there some other reason?  Thanks!!