When I suspected the internal battery in one of my car jump starters was losing performance, I tested its power adaptor at the “pin”. The adaptor is rated at 15V DC output, and my multimeter read 18V.
Being largely ignorant of things “electronic”, and concerned I might be overcharging the battery, I talked to the supplier. They advised that 18V was too high (16V being the highest acceptable output) and said they would find me a replacement adaptor.
I own some 30 AC/DC power adaptors with various rated DC outputs (all AC inputs are 240V, Australia). Being the curious type, I checked all of them and found that only 12 tested at, or very close to, their rated DC output. The other 18 adaptors all read “high”, from 3V to 8V above their rated outputs.
While many of my adaptors are quite old (retained as spares after their original devices were discarded), I was still surprised by the proportion of apparent “duds”, and it made me question whether my testing approach was valid.
Below is a representative sample of my results:
Device                  Rated output   DMM reading
15-LED worklight        12V            20V
Hand torch (5W)         12V            18V
Jump starter (1900A)    15V            18V
Sanyo cordless phone    9V             14.5V
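To put the variation in perspective, here is a quick calculation of how far each reading sits above its rating (just a rough sketch in Python, using the figures from the table above):

```python
# Percentage by which each DMM reading exceeds the rated output.
# Values taken from the table above.
readings = {
    "15-LED worklight":     (12.0, 20.0),
    "Hand torch (5W)":      (12.0, 18.0),
    "Jump starter (1900A)": (15.0, 18.0),
    "Sanyo cordless phone": (9.0, 14.5),
}

for device, (rated, measured) in readings.items():
    over = (measured - rated) / rated * 100
    print(f"{device}: rated {rated}V, read {measured}V -> {over:+.0f}%")
```

Even the “best” of these, the jump starter, reads about 20% above its rating.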
My question is: should an AC/DC power adaptor always output its rated DC voltage, or at least be within, say, 1V of it, or is a larger variation acceptable? Is there a general rule of thumb, e.g. that an adaptor's output should be within 10% of its rated output?
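To illustrate the arithmetic of that 10% figure (only a sketch of the rule I am asking about, not a claim that it is correct):

```python
def tolerance_band(rated_v, tolerance=0.10):
    """Return the (low, high) acceptable voltage band for a rated output."""
    return rated_v * (1 - tolerance), rated_v * (1 + tolerance)

low, high = tolerance_band(15.0)   # my jump-starter adaptor
print(f"10% band for 15V: {low:.1f}V to {high:.1f}V")  # 13.5V to 16.5V
# My 18V reading would fall well outside that band.
```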
I'd appreciate any comments or advice in simple English. Thanks.