Plug polarity: I would use a DMM. If the display shows a positive voltage (as opposed to a negative reading), the terminal under the red probe is the positive one; pretty easy to figure out.
I would choose a supply voltage that is convenient. If the application is automotive, I would be on the low voltage end of things.
If I didn't know the required current, maybe I could find a wattage rating and work backwards. Either that or I would just use a car battery, knowing that it can provide up to 1000A for a brief period. I would measure the current with my DMM, and I would probably put a 5A or 10A fuse in series. The meter probably has a 10A fuse and I wouldn't really want to blow it.
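Working backwards from a wattage rating is just I = P / V. A quick sketch (the numbers are hypothetical, assuming a label that gives a power rating and a nominal voltage):

```python
def current_from_power(power_w, voltage_v):
    """Estimate supply current (A) from a power rating (W)
    and the chosen supply voltage (V), using I = P / V."""
    return power_w / voltage_v

# Hypothetical label: "24 W, 12 V nominal"
print(current_from_power(24.0, 12.0))  # 2.0 A
```

That gives you a ballpark for sizing the series fuse and checking whether the meter's 10A range is safe.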
If you do use a car battery, make sure you get the polarity right or you'll see just what 1000A at 12V can do!
An alarm battery or motorcycle battery would probably be safer.
All above assuming you don't have a bench PS capable of delivering, say, 3A at 30V.
Somehow, you should be able to intuit whether the thing is going to take milliamps or megaamps.
Why not tell us what kind of device you're working on?
Hi rstofer,
Thanks. On the polarity, I don't see how you can get a DMM to tell you the polarity if there is no power present; and to apply power without damaging the device, you would have to know the polarity already. See the Catch-22 here? In this use case there are no red or black wires visible and no markings on the chassis cover, just an opening for a power plug. By convention it's probably center-pin positive, but it could be center-pin negative. It's easy to test under power, but if you power up with the wrong polarity you could damage the device. I get that inside there might be some markings or color-coded wiring. So maybe the answer is that there is no good way to know without removing the covers.
For the second set of questions, we are assuming we don't know the current requirement or the wattage spec; we just know a range of acceptable voltages. (For whatever reason, the device has relatively poor markings and no documentation.)
As for the device I have in mind, we can be pretty sure it will require a few amps (anywhere from 1 to 5, probably 2 to 4), so I don't think we need or want a car battery. (Yes, a bench linear supply is available.) Here, the question is the best way to choose the specific voltage to apply, given the range of voltages that has been specified.
In summary, the questions are really aimed at learning a) how to determine the polarity before powering up a device (so as not to damage it by guessing wrong), and b) how to determine which voltage within the specified range would be best.