Author Topic: switching PSUs: why can't you usually exceed 50% of the declared power?  (Read 11751 times)


Offline Alex Eisenhut

  • Super Contributor
  • ***
  • Posts: 3550
  • Country: ca
  • Place text here.
You cannot assess a PSU with light bulb loads, OR its full capability by loading only one rail!

Each output can only be loaded to its own rating, and several outputs may need to be loaded at once to reach the supply's full rated power in watts... but the specs may quote input power consumption in watts, NOT total output.

Further, tungsten bulbs have high initial current draw until the filament reaches operating temp where its resistance is much lower than when cold. This inrush current could send the SMPS into shutdown.

All ^^^ traps for the unwary!

Speaking of traps, I think you meant the resistance is much higher than when cold.  ;)
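To put a rough number on that inrush trap, here is a quick Python sketch. The 12 V / 50 W bulb and the one-tenth cold-resistance figure are my own illustrative assumptions (a common rule of thumb for tungsten), not measurements of any particular lamp:

# Rough inrush estimate for a tungsten bulb used as a PSU dummy load.
# Assumption: cold filament resistance is about 1/10 of the hot (operating)
# value -- a typical rule of thumb for tungsten; real bulbs vary.

def bulb_inrush(rated_watts, supply_volts, cold_fraction=0.1):
    """Return (hot ohms, cold ohms, steady-state amps, worst-case switch-on amps)."""
    r_hot = supply_volts ** 2 / rated_watts   # resistance at operating temperature
    r_cold = r_hot * cold_fraction            # resistance at room temperature
    return r_hot, r_cold, supply_volts / r_hot, supply_volts / r_cold

# Example: a 12 V / 50 W halogen bulb hung on a supply's 12 V rail
r_hot, r_cold, i_steady, i_inrush = bulb_inrush(50, 12)
print(f"hot {r_hot:.2f} ohm, cold {r_cold:.2f} ohm")
print(f"steady {i_steady:.1f} A, switch-on surge roughly {i_inrush:.0f} A")

A surge on the order of ten times the steady current is exactly the sort of transient that trips a supply's over-current protection and causes the shutdown mentioned above.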
Hoarder of 8-bit Commodore relics and 1960s Tektronix 500-series stuff. Unconventional interior decorator.
 
The following users thanked this post: tautech

Offline engrguy42

  • Frequent Contributor
  • **
  • Posts: 656
  • Country: us
FWIW, it's a myth that higher efficiency power supplies save you a lot of money on your electric bills. Do the math.
For most people, moving from an 80% efficient supply to a 90% efficient supply in their PC will not save much on electric bills, because the PC is only working hard for a small percentage of the time. However, for servers, telecoms and other high-power applications running flat out 24 hours a day, every extra percent of efficiency brings a big enough reduction in power bills to justify significantly higher complexity in the power supplies. Try looking at the materials used to promote multi-kilowatt supplies in the 95% to 97% efficiency area. It's all about cost breakdowns over the life of the equipment.

For consumers, a shift from 80% to 90% efficiency is not useless either: halving the supply's heat output helps compact machines run cool with a very quiet fan.

If you look hard enough you can always find an exception to anything. Of course, commercial and industrial usage is ENTIRELY different. We're talking about average users, who don't know any better. Commercial and industrial users know exactly what it's costing them.

And it's easy to make vague generalities like "lower heat is better", but in practical terms it likely has no meaningful effect on actual service life, etc.
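For what it's worth, here is one way to actually do that math: a minimal Python sketch that weights the efficiency difference by how the machine is used. Every wattage, hour count and the tariff below is an assumption for illustration, not a measurement; substitute your own figures:

# Back-of-envelope annual cost comparison of an 80% vs a 90% efficient PSU,
# weighted by an assumed usage profile. All numbers are illustrative only.

TARIFF = 0.20  # $ per kWh (assumed)

# (DC power drawn by the PC in watts, hours per year at that load) -- assumed profile
usage_profile = [
    (250.0, 2 * 365),   # heavy load (gaming/rendering), 2 h/day
    (60.0,  4 * 365),   # light desktop use, 4 h/day
    (3.0,  18 * 365),   # idle/standby the rest of the time
]                       # (real standby efficiency is worse than the headline figure;
                        #  that refinement is ignored here for brevity)

def annual_cost(efficiency):
    """Wall-side energy cost for the assumed profile at a given PSU efficiency."""
    wall_kwh = sum(watts / efficiency * hours for watts, hours in usage_profile) / 1000.0
    return wall_kwh * TARIFF

low, high = annual_cost(0.80), annual_cost(0.90)
print(f"80% PSU: ${low:.2f}/yr   90% PSU: ${high:.2f}/yr   difference: ${low - high:.2f}/yr")

With a light-duty profile like this the difference comes out to only a few dollars a year; pile on the heavy-load hours and it grows quickly, which is really what the disagreement in the following posts comes down to.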
- The best engineers know enough to realize they don't know nuthin'...
- Those who agree with you can do no wrong. Those who disagree can do no right.
- I'm always amazed at how many people "already knew that" after you explain it to them in detail...
 

Offline wraper

  • Supporter
  • ****
  • Posts: 17952
  • Country: lv
This is a totally wrong argument. The savings on the bill promoted by such "ratings" (80%, 90%, Gold Plus and the like) are garbage.

Those PSUs are targeted at high-end PCs with power-hungry GPUs that burn a whopping 700 W of real power, plus the ownership costs.

SOHO and POS machines are well below 300 W, and there the cost focus is more on hardware wearing out soon than on the mains bill.
Total bullshit. Even if a computer consumes only 100 W, a 10% difference in consumption is a big deal. And the estimate in my previous post is based on such a figure for office use. If a high-end GPU and frequent gaming are involved, it becomes even more noticeable, and a more expensive PSU may pay for itself in less than a year.
 

Offline Monkeh

  • Super Contributor
  • ***
  • Posts: 8135
  • Country: gb
And it's easy to make vague generalities like "lower heat is better", but in practical terms it likely has no meaningful effect on actual service life, etc.

However, the difference in component quality between the low-efficiency garbage certain people are promoting and a decently made, efficient supply is relevant in terms of service life.
 

Offline engrguy42

  • Frequent Contributor
  • **
  • Posts: 656
  • Country: us
This is a totally wrong argument. The savings on the bill promoted by such "ratings" (80%, 90%, Gold Plus and the like) are garbage.

Those PSUs are targeted at high-end PCs with power-hungry GPUs that burn a whopping 700 W of real power, plus the ownership costs.

SOHO and POS machines are well below 300 W, and there the cost focus is more on hardware wearing out soon than on the mains bill.
Total bullshit. Even if a computer consumes only 100 W, a 10% difference in consumption is a big deal. And the estimate in my previous post is based on such a figure for office use. If a high-end GPU and frequent gaming are involved, it becomes even more noticeable, and a more expensive PSU may pay for itself in less than a year.

Again, do you have ACTUAL numbers to support this, or is it just vague generalizations based on personal belief?
- The best engineers know enough to realize they don't know nuthin'...
- Those who agree with you can do no wrong. Those who disagree can do no right.
- I'm always amazed at how many people "already knew that" after you explain it to them in detail...
 

Offline mikerj

  • Super Contributor
  • ***
  • Posts: 3382
  • Country: gb
This is a totally wrong argument. The savings on the bill promoted by such "ratings" (80%, 90%, Gold Plus and the like) are garbage.

Those PSUs are targeted at high-end PCs with power-hungry GPUs that burn a whopping 700 W of real power, plus the ownership costs.

SOHO and POS machines are well below 300 W, and there the cost focus is more on hardware wearing out soon than on the mains bill.
Total bullshit. Even if a computer consumes only 100 W, a 10% difference in consumption is a big deal. And the estimate in my previous post is based on such a figure for office use. If a high-end GPU and frequent gaming are involved, it becomes even more noticeable, and a more expensive PSU may pay for itself in less than a year.

All depends. A single computer at home, used maybe a couple of hours a day or so, and the savings will be barely noticeable. A whole office full of PCs running 24/7 and the savings start to look a bit more compelling.
 

Offline Monkeh

  • Super Contributor
  • ***
  • Posts: 8135
  • Country: gb
I had a machine with an ~80% efficient (at this load) supply which ran 24/7/365 at about 85W at the wall. If I'd had a less-than-stellar supply in it, such as those suggested, call it 95W. That amounts to something around £10-20 a year (prices vary wildly; apparently the current average is 14.37p/kWh, or £12.59 a year). That's not that different from the cost difference between a cheap trash supply (about £15-20) and a current 80+ Gold supply (about £45). That machine was in said use for four or five years.

Meanwhile, I spent £20 and replaced it with something using 20W at the wall... it paid for itself in months.
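For anyone who wants to check those figures, the arithmetic is a one-liner; the wattages, prices and tariff below are the ones quoted above:

# Sanity check of the figures above: a 24/7 machine at 85 W vs 95 W at the wall,
# and the GBP 20 replacement drawing 20 W, all at 14.37 p/kWh (figures from this post).

TARIFF = 0.1437      # GBP per kWh
HOURS = 24 * 365     # always-on

def annual_cost(wall_watts):
    return wall_watts / 1000.0 * HOURS * TARIFF

print(f"95 W vs 85 W supply: £{annual_cost(95) - annual_cost(85):.2f} per year")
old, new = annual_cost(85), annual_cost(20)
print(f"85 W box £{old:.2f}/yr vs 20 W box £{new:.2f}/yr")
print(f"the £20 replacement pays for itself in about {20 / ((old - new) / 12):.1f} months")

which reproduces the ~£12.59 a year and the few-month payback quoted above.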
« Last Edit: May 24, 2020, 01:30:59 pm by Monkeh »
 

Offline wraper

  • Supporter
  • ****
  • Posts: 17952
  • Country: lv
Again, do you have ACTUAL numbers to support this, or is it just vague generalizations based on personal belief?
Say (15 W efficiency difference * 8 h * 261 working days + 3 W standby-efficiency difference * 24 h * 365 days) * $0.20/kWh ≈ $11.5 a year.
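The same estimate in code form, for anyone who wants to plug in their own hours or tariff; the numbers are exactly the ones above:

# 15 W efficiency difference during 8 h workdays (261 working days a year),
# plus a 3 W standby-efficiency difference around the clock, at $0.20/kWh.

TARIFF = 0.20  # $ per kWh (assumed)

active_kwh  = 15 / 1000 * 8 * 261    # difference while the PC is in use
standby_kwh = 3 / 1000 * 24 * 365    # standby difference, all year round
print(f"about ${(active_kwh + standby_kwh) * TARIFF:.2f} per year")   # prints about $11.52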
 
The following users thanked this post: Siwastaja

Offline engrguy42

  • Frequent Contributor
  • **
  • Posts: 656
  • Country: us
Again, do you have ACTUAL numbers to support this, or is it just vague generalizations based on personal belief?
Say (15 W efficiency difference * 8 h * 261 working days + 3 W standby-efficiency difference * 24 h * 365 days) * $0.20/kWh ≈ $11.5 a year.

So you're running your computer at full power for 8 hrs/day, 365 days/year?
- The best engineers know enough to realize they don't know nuthin'...
- Those who agree with you can do no wrong. Those who disagree can do no right.
- I'm always amazed at how many people "already knew that" after you explain it to them in detail...
 

Offline coppice

  • Super Contributor
  • ***
  • Posts: 10035
  • Country: gb
Again, do you have ACTUAL numbers to support this, or is it just vague generalizations based on personal belief?
Say (15 W efficiency difference * 8 h * 261 working days + 3 W standby-efficiency difference * 24 h * 365 days) * $0.20/kWh ≈ $11.5 a year.

So you're running your computer at full power for 8 hrs/day, 365 days/year?
Haven't you seen how gamers live?  :)
 

Offline wraper

  • Supporter
  • ****
  • Posts: 17952
  • Country: lv
Again, do you have ACTUAL numbers to support this, or is it just vague generalizations based on personal belief?
Say (15 W efficiency difference * 8 h * 261 working days + 3 W standby-efficiency difference * 24 h * 365 days) * $0.20/kWh ≈ $11.5 a year.

So you're running your computer at full power for 8 hrs/day, 365 days/year?
Check the numbers again: it's 261 days, and 365 only for the standby part. If it were my computer, the difference would be much higher, since I run it about 12-14 hours a day; and especially if I ran it "at full power", since it's a high-end PC with a high-end GPU.
 

Offline engrguy42

  • Frequent Contributor
  • **
  • Posts: 656
  • Country: us
My suggestion for those who really want to understand computer power supply and GPU power usage...

Get a cheap kWh monitoring device (like a KillAWatt), and plug your computer into it. It will tell you the ACTUAL kWh usage of your entire computer (including the GPU) over time. You can then calculate the difference if it had a power supply of a different efficiency, and work out a more realistic value for the cost savings.

GPUs and other computer components don't use a fixed amount of power during the day, nor do they operate at their maximum ratings all day. It varies. And in some/many cases they never reach their rated power specification, depending on the software they're running.

You may be surprised that the cost savings you thought you were getting just aren't there. 
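As a sketch of that last step once you have a meter reading: every number below is a placeholder, so substitute whatever your own meter and your PSU's efficiency curve actually say:

# Turn a measured wall-side energy figure into an estimated saving from a more
# efficient PSU. All inputs are placeholders -- use your own measured numbers.

measured_wall_kwh   = 120.0   # what the kWh meter showed over the period (assumed)
current_efficiency  = 0.80    # installed PSU efficiency at your typical load (assumed)
proposed_efficiency = 0.90    # efficiency of the PSU being considered (assumed)
tariff              = 0.20    # $ per kWh (assumed)

dc_kwh = measured_wall_kwh * current_efficiency     # energy the PC itself consumed
proposed_wall_kwh = dc_kwh / proposed_efficiency    # wall energy with the better PSU
saving = (measured_wall_kwh - proposed_wall_kwh) * tariff

print(f"estimated saving over the measured period: ${saving:.2f}")

A single efficiency figure is itself an approximation, since the real curve varies with load, which is exactly why measuring first beats guessing.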

- The best engineers know enough to realize they don't know nuthin'...
- Those who agree with you can do no wrong. Those who disagree can do no right.
- I'm always amazed at how many people "already knew that" after you explain it to them in detail...
 
The following users thanked this post: PKTKS

Offline coppice

  • Super Contributor
  • ***
  • Posts: 10035
  • Country: gb
My suggestion for those who really want to understand computer power supply and GPU power usage...

Get a cheap kWh monitoring device (like a KillAWatt), and plug your computer into it. It will tell you the ACTUAL kWh usage of your entire computer (including the GPU) over time. You can then calculate the difference if it had a power supply of a different efficiency, and work out a more realistic value for the cost savings.

GPUs and other computer components don't use a fixed amount of power during the day, nor do they operate at their maximum ratings all day. It varies. And in some/many cases they never reach their rated power specification, depending on the software they're running.

You may be surprised that the cost savings you thought you were getting just aren't there.
I'd recommend any serious engineer to get a KillAWatt or two, and monitor some of the things around their home. Anyone competent knows the approximate power consumption of most of the stuff around the house, but usage patterns determine the total energy consumed. Those patterns are not always obvious until you start making some measurements. It can be quite revealing.
 

Offline PKTKS

  • Super Contributor
  • ***
  • Posts: 1766
  • Country: br
(..)
You may be surprised that the cost savings you thought you were getting just aren't there.

Very true indeed.

Add to the list of costs a REQUIRED TRUE SINE INVERTER
for those setups using PFC based sensitive loads.

A couple of months ago a guy called me asking my opinion about someone trying to sell him an over-the-top "PC"... multi-core i9, latest-gen GPU, and an awesome 1000 W PSU, which included a 2 kW TRUE SINE INVERTER NoBreak (UPS).

Just to run MS Word on a desktop a dozen times a day. The seller's argument was that the MS Windows GUI now requires a GPU to display Office properly, and that an i9 multi-core CPU with liquid cooling is required to handle ANTI VIRUS stuff.

Of course the seller's margin was considerable as well.

I gave him my opinion, which is absolutely unprintable here...

Paul

 

Offline Monkeh

  • Super Contributor
  • ***
  • Posts: 8135
  • Country: gb
You may be surprised that the cost savings you thought you were getting just aren't there.

The figures I measured were at idle on a configured system. Consumption simply will not have been lower at any time. I'm sorry this doesn't support your argument, but that's the problem with biases.
 

Offline Mechatrommer

  • Super Contributor
  • ***
  • Posts: 11714
  • Country: my
  • reassessing directives...
Get a cheap kWh monitoring device (like a KillAWatt), and plug your computer into it.
I'd recommend any serious engineer to get a KillAWatt or two, and monitor some of the things around their home.
I'd recommend the Uni-T UT210E; it can be used to design an SMPS as well.. ;D
Nature: Evolution and the Illusion of Randomness (Stephen L. Talbott): Its now indisputable that... organisms “expertise” contextualizes its genome, and its nonsense to say that these powers are under the control of the genome being contextualized - Barbara McClintock
 

Offline engrguy42

  • Frequent Contributor
  • **
  • Posts: 656
  • Country: us
You may be surprised that the cost savings you thought you were getting just aren't there.

The figures I measured were at idle on a configured system. Consumption simply will not have been lower at any time. I'm sorry this doesn't support your argument, but that's the problem with biases.

I wasn't responding to you, and honestly I'm not sure what point you were trying to make. And my only argument is to suggest people look at the facts by actually measuring rather than supposing. Honestly I couldn't care less what anyone here chooses to do.
- The best engineers know enough to realize they don't know nuthin'...
- Those who agree with you can do no wrong. Those who disagree can do no right.
- I'm always amazed at how many people "already knew that" after you explain it to them in detail...
 

Offline Monkeh

  • Super Contributor
  • ***
  • Posts: 8135
  • Country: gb
Add to the list of costs a REQUIRED TRUE SINE INVERTER
for those setups using PFC based sensitive loads.

And yet I achieve uptimes measured in years with no REQUIRED TRUE SINE INVERTER in sight. No INVERTER at all, actually. Or anything else in all-caps you might want to mention..
 
The following users thanked this post: Siwastaja

Offline coppice

  • Super Contributor
  • ***
  • Posts: 10035
  • Country: gb
Add to the list of costs a REQUIRED TRUE SINE INVERTER
for those setups using PFC based sensitive loads.
And yet I achieve uptimes measured in years with no REQUIRED TRUE SINE INVERTER in sight. No INVERTER at all, actually. Or anything else in all-caps you might want to mention..
Many countries are not familiar with a public power supply more reliable than the bulk of UPS solutions.
 

Offline Siwastaja

  • Super Contributor
  • ***
  • Posts: 9336
  • Country: fi
Why do we have this inrush of 11-year-old* gaming kids spewing 500 posts' worth of arrogant bullshit in no time, on such a fine engineering forum?

*) Physical or mental, doesn't matter

Is it the new general computing section?
 

Offline nfmax

  • Super Contributor
  • ***
  • Posts: 1624
  • Country: gb
Add to the list of costs a REQUIRED TRUE SINE INVERTER
for those setups using PFC based sensitive loads.
And yet I achieve uptimes measured in years with no REQUIRED TRUE SINE INVERTER in sight. No INVERTER at all, actually. Or anything else in all-caps you might want to mention..
Many countries are not familiar with a public power supply more reliable than the bulk of UPS solutions.

Though even there, it may not be TRUE SINE mains anyway:

https://www.eevblog.com/forum/projects/show-us-your-mains-waveform!/
 

Offline Monkeh

  • Super Contributor
  • ***
  • Posts: 8135
  • Country: gb
Add to the list of costs a REQUIRED TRUE SINE INVERTER
for those setups using PFC based sensitive loads.
And yet I achieve uptimes measured in years with no REQUIRED TRUE SINE INVERTER in sight. No INVERTER at all, actually. Or anything else in all-caps you might want to mention..
Many countries are not familiar with a public power supply more reliable than the bulk of UPS solutions.

Indeed. But that does not mean everyone using a computer globally requires a UPS.
 

Offline wizard69

  • Super Contributor
  • ***
  • Posts: 1184
  • Country: us
Really, is it that hard to figure it out?

I'd like to know details: is it for capacitors? If so, what is the difference between "good" capacitors and "bad" capacitors? And is there any table around telling which are the good ones (vendor/models/etc.)?
Why do you want to know? Cheap or scam hardware is what it is. A bad supply can be the result of many things, from one substandard component to the whole thing being built to one standard with a label slapped on it claiming another capability. In some cases it is as simple as that: a label gets slapped on a device that does not meet the device's designed specs.
Quote
Is it for the transformer? Filters? Protection chip? Oscillators? What actually makes a PSU a "bad" PSU?

It's too easy to say "oh, because it's cheap". Cheap can mean everything and nothing.

Cheap means everything. It can mean cutting corners. It can mean outright lies about a product's capability. Or it can simply mean that the cheapest possible parts got embedded in the PSU. This really isn't hard to understand.
 

Offline wizard69

  • Super Contributor
  • ***
  • Posts: 1184
  • Country: us
Add to the list of costs a REQUIRED TRUE SINE INVERTER
for those setups using PFC based sensitive loads.
And yet I achieve uptimes measured in years with no REQUIRED TRUE SINE INVERTER in sight. No INVERTER at all, actually. Or anything else in all-caps you might want to mention..
Many countries are not familiar with a public power supply more reliable than the bulk of UPS solutions.

Indeed. But that does not mean everyone using a computer globally requires a UPS.

More importantly, people can buy laptops, batteries included, if power is a real issue. Cheaper than a UPS. Marginal utilities can be a problem anywhere, and laptops are a good way to deal with that. Yes, I realize that a laptop does not replace a workstation performance-wise, but that is another issue.

To pull the discussion back on track a bit: laptop power adapters are not the most reliable things either. It varies from manufacturer to manufacturer, but one can see awfully high failure rates in these adapters.
 

Offline Monkeh

  • Super Contributor
  • ***
  • Posts: 8135
  • Country: gb
To pull the discussion back on track a bit: laptop power adapters are not the most reliable things either. It varies from manufacturer to manufacturer, but one can see awfully high failure rates in these adapters.

Consequence of the primary design compromises: size and cost. That, and being left on carpets, getting blankets tossed on them, and so forth.

IME the ones used on cheap tat like MSI and Acer aren't stellar; the ones supplied with business-grade machines tend to be tougher.
 


Share me

Digg  Facebook  SlashDot  Delicious  Technorati  Twitter  Google  Yahoo
Smf