@Fungus
The working voltage.
So ... CAT III 1000V is better than CAT IV 600V?
Why would they even bother with CAT IV then?
@Fungus
The working voltage.
So ... CAT III 1000V is better than CAT IV 600V?
Why would they even bother with CAT IV then?
Read the paper carefully ...
(But reading that good paper carefully will not increase your post counter ...)
@Fungus
The maximum working voltage is the maximum voltage that can be applied to the instrument under normal conditions on a permanent basis.
It is a voltage that will not cause any deterioration of the instrument through gradual failure of components inside, and it will protect the user while measuring, if good practice is followed.
So, if the highest voltage you are about to measure is 400V, that is your working voltage. You then buy an instrument rated 600V just to be on the safe side.
CAT II, III, IV is resilience to overvoltage transients, and is related to the environment you are working in. If you have a 12V off-grid solar system on a ranch in the country, with long cables that run overhead, those cables can pick up huge transients in stormy weather...
So in that case you would need a CAT IV meter, while a working voltage of 24V would be sufficient, if such a thing existed...
By the same token, if you are repairing 800V tube amplifiers in your well-protected lab in a city, you might need a voltmeter with a 1000V working voltage, and CAT II, or even CAT I, might be fine for you.
Simple as that. Well, not really, but that is the basic logic. The trick is that most of the time it is hard to know in what environment you are or will be working, so people prefer to err on the safe side.
Regards,
Sinisa
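Sinisa's two-axis rule (working-voltage rating from the highest voltage you measure, CAT rating from the environment you measure it in) can be sketched as a toy lookup. This is purely illustrative: the environment labels and the "next rating up" headroom rule are my own assumptions, not anything from IEC 61010.

```python
# Toy sketch of the two independent choices described above:
# the working-voltage rating comes from what you measure,
# the CAT rating comes from where you measure it.
# Thresholds and environment labels are illustrative assumptions.

STANDARD_RATINGS = [150, 300, 600, 1000]  # common meter working-voltage ratings

def pick_meter(max_measured_volts: float, environment: str) -> str:
    """Return a minimal meter spec for a measurement scenario."""
    # Smallest standard rating strictly above the working voltage,
    # leaving headroom as suggested (measure 400V -> buy a 600V meter).
    rating = next(r for r in STANDARD_RATINGS if r > max_measured_volts)
    cat = {
        "lab/electronics": "CAT I or CAT II",
        "appliances/outlets": "CAT II",
        "distribution/fixed installation": "CAT III",
        "utility connection/outdoors": "CAT IV",
    }[environment]
    return f"{rating}V working voltage, {cat}"

print(pick_meter(400, "distribution/fixed installation"))
# -> 600V working voltage, CAT III
print(pick_meter(12, "utility connection/outdoors"))
# the off-grid solar example: tiny working voltage, but a CAT IV environment
```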
@Fungus
The working voltage.
So ... CAT III 1000V is better than CAT IV 600V?
Why would they even bother with CAT IV then?
Read the paper carefully ...
(But reading that good paper carefully will not increase your post counter ...)
Sigh. I'll risk increasing my post counter and try to make it more obvious.
The 'paper' says:
Now ... if we look at the table, the source impedance for CAT III and CAT IV is the same, so that's not the answer!
The only thing The Paper says about CAT IV is this:
Now, I'm not completely lazy, so I looked up IEC 1010 here:
http://www.ni.com/white-paper/2827/en/
The only thing it says about CAT IV is:
Problem: I don't know what the "other documents" (the ones that cover CAT IV) are, so I thought I'd ask here.
So....
Question: What's the difference between "CAT III 1000V" and "CAT IV 600V"?
i.e. if the "working voltage" of CAT III 1000V is higher and the transients are the same, what's CAT IV all about?
(And why is there no such thing as "CAT IV 1000V"?)
(And why is there no such thing as "CAT IV 1000V"?)
There is. Brymen has a few meters with that rating. Fluke didn't put it in their white paper, probably because they don't have one...
CAT IV 1000V is rated for 1000V continuous and a 12000V transient... That's your difference...
Table attached, source National Instruments...
Cheers.
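Since the attached table can't survive as forum text, here is the IEC 61010 impulse table as it is commonly reproduced (e.g. in the NI white paper linked earlier), written as a small lookup. Treat the numbers as a sketch to be checked against the standard itself:

```python
# Transient (impulse) test levels per IEC 61010, as commonly reproduced
# (e.g. in the NI white paper linked above). Verify against the standard.
# Key: (working voltage, category) -> (peak transient volts, source impedance ohms)
TRANSIENTS = {
    (150,  "CAT I"): (800,   30), (150,  "CAT II"): (1500,  12),
    (150,  "CAT III"): (2500, 2), (150,  "CAT IV"): (4000,   2),
    (300,  "CAT I"): (1500,  30), (300,  "CAT II"): (2500,  12),
    (300,  "CAT III"): (4000, 2), (300,  "CAT IV"): (6000,   2),
    (600,  "CAT I"): (2500,  30), (600,  "CAT II"): (4000,  12),
    (600,  "CAT III"): (6000, 2), (600,  "CAT IV"): (8000,   2),
    (1000, "CAT I"): (4000,  30), (1000, "CAT II"): (6000,  12),
    (1000, "CAT III"): (8000, 2), (1000, "CAT IV"): (12000,  2),
}

# The two ratings at the heart of the thread share the same impulse test:
print(TRANSIENTS[(1000, "CAT III")])  # (8000, 2)
print(TRANSIENTS[(600,  "CAT IV")])   # (8000, 2)
# ...while CAT IV 1000V steps up to a 12 kV impulse:
print(TRANSIENTS[(1000, "CAT IV")])   # (12000, 2)
```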
(And why is there no such thing as "CAT IV 1000V"?)
There is. Brymen has a few meters with that rating. Fluke didn't put it in their white paper, probably because they don't have one...
OK.
CAT IV 1000V is rated for 1000V continuous and a 12000V transient... That's your difference...
In that case the question becomes: Is CAT III 1000V better than CAT IV 600V?
(It has a higher working voltage!)
Like I said before, if you don't work on installations directly connected to distribution networks or outside, and you work only in the lab, then CAT II is probably enough as far as overvoltage transient protection is concerned. And if you work with 800V valve amplifiers in your lab, then yes, 1000V CAT II is better than 600V CAT IV...
The 121GW goes back in time.
Excellent work Joe.
Thanks for your time and effort.
3DB.
Like I said before, if you don't work on installations directly connected to distribution networks or outside, and you work only in the lab, then CAT II is probably enough as far as overvoltage transient protection is concerned. And if you work with 800V valve amplifiers in your lab, then yes, 1000V CAT II is better than 600V CAT IV...
OK, I think I get it:
They were drawing a table, four categories, 150V/300V/600V/1000V in each category, stepping up the transients, etc.
Then they ran into a limit: they couldn't decrease the source impedance for CAT IV, it got stuck at 2 ohms!
Result: CAT III 1000V is better than CAT IV 600V.
Is that right?
(and is my Fluke 27, CAT III 1000V, really a CAT IV 600V meter, only that it was made before IEC 61010 2nd Edition?)
Result: CAT III 1000V is better than CAT IV 600V.
Not really. For indoor work in industrial conditions you need at least CAT III; for outdoor work you need CAT IV. I.e. the same meter can be used up to 600V outdoors and up to 1000V indoors.
I think a better description is:
CAT III is for after the mains fuse board (rather than just "indoor")
CAT IV is for before the mains fuse board (rather than just "outdoor")
As someone else pointed out, it is to do with the amount of energy available at that point.
i.e. after the mains fuse board, the fuse should blow and limit the energy.
Before the fuse board, you are relying on the substation trips - how many amps are they going to trip at?
I think a better description is:
CAT III is for after the mains fuse board (rather than just "indoor")
CAT IV is for before the mains fuse board (rather than just "outdoor")
Yes, I know what the description says.
What's the physical difference inside the meter?
Put another way:
Is it possible to make a CAT III 1000V meter which *isn't* a CAT IV 600V meter?
What's the physical difference inside the meter?
Why do you want a difference inside the meter? An 8000V 2-ohm transient test qualifies it for usage at 1000V indoors (or after the mains breaker) and 600V outdoors (or before the mains breaker).
What's the physical difference inside the meter?
Why do you want a difference inside the meter? An 8000V 2-ohm transient test qualifies it for usage at 1000V indoors (or after the mains breaker) and 600V outdoors (or before the mains breaker).
I don't *want* a difference, I want to know if there *is* a difference.
(Consensus seems to be "no".)
I don't want a difference, I want to know if there is a difference.
As someone else pointed out, it is to do with the amount of energy available at that point.
i.e. after the mains fuse board, the fuse should blow and limit the energy.
Before the fuse board, you are relying on the substation trips - how many amps are they going to trip at?
The CAT ratings tell you what voltage (silly really, it should be transient power) the meter will handle.
So both the CAT III and CAT IV ratings apply.
With the CAT III rating the meter can withstand a higher voltage because the amount of power is limited by the local fuse box.
With the CAT IV rating the meter withstands a lower voltage because it is now relying on the substation (or whatever) to limit the power.
The confusion seems to be exacerbated by strictly comparing numbers. However, the CAT ratings take into account more than just the test numbers. Safety also takes into consideration the operating environment. Since using a meter in a CATIV environment has the potential (heh) to deliver more energy than a CATIII environment, and may be more likely to do so, a greater margin is imposed in the so-called limits by lowering the maximum voltage.
A DMM that passes the 8000-volt 2-ohm transient test is deemed safe at 1000V in environments up to CATIII. When going outside the building to a CATIV environment, the same DMM probably would be fine at 1000V, but the maximum voltage "allowed" is 600 V for additional safety margin.
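To put a rough number on the "available energy" argument: with the source impedance pegged at 2 ohms, the prospective peak current of the impulse test scales directly with the test voltage. A back-of-envelope sketch (ignoring the meter's own impedance, which in reality limits the current):

```python
def peak_current(transient_v: float, source_z: float) -> float:
    # Worst-case prospective current of the impulse generator (Ohm's law),
    # assuming the device under test looks like a dead short.
    return transient_v / source_z

print(peak_current(8000, 2))   # CAT III 1000V / CAT IV 600V test: 4000.0 A
print(peak_current(12000, 2))  # CAT IV 1000V test: 6000.0 A
```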
What's the physical difference inside the meter?
Why do you want a difference inside the meter? An 8000V 2-ohm transient test qualifies it for usage at 1000V indoors (or after the mains breaker) and 600V outdoors (or before the mains breaker).
I don't want a difference, I want to know if there is a difference.
(consensus seems to be "no").
The closer you work to the grid (the working environment, as mentioned), the higher the risk of transients, and the greater the risk to the meter and user. Hence meter voltage derating.
Power network fuses are larger, supply cabling is heavier, and so on.
Think impedance of supply.
OMG, I think I've seen the 121GW's guts on video... If you still want to keep the internals secret you'll have to remove the video... and deal with the 778 viewers (so far).
Meter voltage derating.
Yes, "derating". That's the term I was trying to think of earlier.
FWIW, the Fluke 189 has a 1000 volts CAT III rating on the meter itself, no mention of CAT IV.
It depends on when it was made. The first version of the standard only went up to CAT III; CAT IV was added in the second edition.
Fungus, I think if you wanted to figure this out, the place to start is by writing to one or more manufacturers and seeing what they have to say. Certainly they are the experts.
It should be obvious that the higher the CAT rating, the more energy is available. I assume you understand why they would derate the meter at higher CAT ratings, and that you really are just asking whether any CAT III rated meter is automatically rated to CAT IV 600V and, if not, what the difference is.
Again, turn to the experts, but the first thing I would consider (a guess on my part) is that the fuses used would be rated to break a higher-energy circuit for CAT IV than for CAT III. Maybe for a CAT III environment, for example, they use 1kV AC/DC 10kA-rated fuses. For CAT IV 600V, they may require 20kA, and for CAT IV 1000V maybe 30kA. Again, ask the experts. I am just guessing.
How did your relay drop test ever work out? Did you ever buy any and try to get them to change states? I have not heard any more from Gossen but I expect they don't move very fast.
OMG, I think I've seen the 121GW's guts on video... If you still want to keep the internals secret you'll have to remove the video... and deal with the 778 viewers (so far).
Good luck reverse engineering it from that video.
Dave had made a video some time back and I think posted a few pictures of it. I don't think he would want me diving into details about the design and make this public.