Author Topic: Interpretation of Equipment Rating Labels  (Read 669 times)


Offline LesoleeTopic starter

  • Regular Contributor
  • *
  • Posts: 223
  • Country: gb
Interpretation of Equipment Rating Labels
« on: January 17, 2020, 02:26:03 pm »
I have just put one of our products up for external safety assessment. It is not mandatory, and is very expensive, so I have never done this before. Ordinarily I just apply the relevant standards (as I understand them). In this case, since EN 60950-1 is obsolete, the relevant standard is BS EN 62368-1:2014.

I have the rating 90 – 264 V on the product, as it is intended for worldwide use. That has been a standard range for a long time. No, says the accredited test house, you have to label it as 100 – 240 V, because we are going to test to 100 V − 10 % and 240 V + 10 %, since that's what the standard says. WHAT? Where does it say that? Well, it doesn't. They showed me the same clause I had been reading, but they are interpreting it quite differently.

My question is: if the product is rated 100 – 240 V, would you be happy to run it at a measured supply rail of 250 V? Or how about at 95 V? Would you think this was reasonable?


A little trip down memory lane for those who are (still) interested. In the “old days” we used to have transformers and linear power supplies. There were taps on the transformers, and you would set the input to 240 V; 230 V; 220 V, and so on. It would have been very reasonable to apply a ±10% tolerance to each tap setting. In my first job (circa 1982), designing premium DVMs, there was no way anyone was putting a switched-mode power supply into a 7-digit DVM. Too noisy.

Fast forward to 1986 (designing oscilloscopes) and I designed the first switched-mode supply used on a Gould scope, the 400. This used a mains transformer tapped for 115 V and 230 V ranges, and a switcher to avoid intermediate tap changing. So you had two ranges, 115 V and 230 V, which meant 90 – 130 V and 190 – 265 V.

Fast forward to now and everyone uses switchers for everything. 90 – 264 V is a common specification you see. But what does it mean? According to EN 62368-1:2014 clause F.3.3.4 the marking means:
“a range indicated by minimum and maximum values separated by a hyphen”
Seems reasonable. I would not test beyond the stated minimum or maximum values because, well, they are minimum and maximum values!

So we have the clause in question:
Quote
B.2.3 Supply voltage
In determining the most unfavourable supply voltage for a test, the following variables shall be taken into account:

– multiple rated voltages;
– extremes of rated voltage ranges; and
– tolerance on rated voltage as declared by the manufacturer.

Unless the manufacturer declares a wider tolerance, the minimum tolerance shall be taken as +10 % and −10 % for a.c. mains and +20 % and −15 % for d.c. mains. Equipment intended by the manufacturer to be restricted to connection to a conditioned power supply system (for example, a UPS) may be provided with a narrower tolerance if the equipment is also provided with instructions specifying such restriction.

The accredited test house is interpreting this clause to mean that they should (must) apply 10% tolerances to the stated max and min values. I don’t read it that way, but they do, … and they are to be obeyed!
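
To put numbers on the difference between the two readings, here is a quick illustration (my own back-of-envelope sketch, not anything taken from the standard; the helper function is just made up for the purpose):

Code:
# Illustrative only: applying the B.2.3 minimum a.c. tolerance (+/-10 %)
# to the values printed on the rating label, the way the test house reads it.
AC_TOL = 0.10

def test_extremes(v_min_rated, v_max_rated, tol=AC_TOL):
    """Lowest and highest supply voltages the unit would be tested at."""
    return v_min_rated * (1 - tol), v_max_rated * (1 + tol)

print(test_extremes(100, 240))  # about 90 V and 264 V: a 100-240 V label already covers 95 V and 250 V
print(test_extremes(90, 264))   # about 81 V and 290 V: what a 90-264 V label would have to survive

On my reading, a 90 – 264 V label would simply be tested at 90 V and 264 V; on theirs, the same label drags the test points out to about 81 V and 290 V.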

I have searched my house and found loads of plug-top power supplies, and those with universal input are all rated 100 – 240 V. But they are probably all externally approved as well. It seems to me that accredited test houses apply this (to me, arbitrary) tolerance to the voltage range, whereas manufacturers of equipment not certified by external agencies would not interpret the standard in that way.

If I saw one thing rated 100 – 240 V and another at 90 – 264 V, I would be inclined to think that the 90 – 264 V one was the more broad-ranging of the two. And yet it seems that this is merely a different interpretation of the same standard. That means those using the accredited approach are potentially at a commercial disadvantage >:(
 

Offline Benta

  • Super Contributor
  • ***
  • Posts: 6257
  • Country: de
Re: Interpretation of Equipment Rating Labels
« Reply #1 on: January 17, 2020, 09:20:47 pm »
I've just taken a look at six different wall warts, chargers and test equipment in my household, all from reputable manufacturers (HP, Canon, Dell, IBM, Agilent).
They all specify input voltage as 100-240 V~.
It seems your test house is right.

 
The following users thanked this post: Lesolee

Offline Vovk_Z

  • Super Contributor
  • ***
  • Posts: 1458
  • Country: ua
Re: Interpretation of Equipment Rating Labels
« Reply #2 on: January 17, 2020, 10:16:35 pm »
Rated voltage is one thing, but the real working voltage is another thing.
For example, if the rated voltage is 100 V, then the real working range could be 90 – 110 V or something else.
 
The following users thanked this post: Lesolee

Offline Cerebus

  • Super Contributor
  • ***
  • Posts: 10576
  • Country: gb
Re: Interpretation of Equipment Rating Labels
« Reply #3 on: January 17, 2020, 10:45:50 pm »
I would interpret that as: the maxima and minima to be tested are the rated minimum voltage − 10 % and the rated maximum voltage + 10 %, and the rated voltages are the ones printed on the formal rating label. So, yeah, I think the test house are right.

Note that the standard gives you the option of "the manufacturer declares a wider tolerance" but not an option of declaring a narrower tolerance, otherwise you could have had your cake and eaten it by rating at "90 – 264 V ±0%".
Anybody got a syringe I can use to squeeze the magic smoke back into this?
 
The following users thanked this post: Lesolee

